Generalized Bayesian likelihood-free inference

Cited: 0
Authors
Pacchiardi, Lorenzo [1 ]
Khoo, Sherman [2 ]
Dutta, Ritabrata [3 ]
Affiliations
[1] Univ Cambridge, Leverhulme Ctr Future Intelligence, Cambridge, England
[2] Univ Bristol, Sch Math, Bristol, England
[3] Univ Warwick, Dept Stat, Warwick, England
Source
ELECTRONIC JOURNAL OF STATISTICS | 2024, Vol. 18, Issue 2
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
Keywords
Likelihood-free inference; generalized Bayes; scoring rules; pseudo-marginal MCMC; computation
DOI
10.1214/24-EJS2283
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
Generalized Bayesian inference replaces the likelihood in the Bayesian posterior with the exponential of a loss function connecting parameter values and observations. As a loss function, it is possible to use Scoring Rules (SRs), which evaluate the match between the observation and the probabilistic model for given parameter values. In this work, we leverage this Scoring Rule posterior for Bayesian Likelihood-Free Inference (LFI). In LFI, we can sample from the model but not evaluate the likelihood; hence, we use the Energy and Kernel SRs in the SR posterior, as they admit unbiased empirical estimates. While traditional Pseudo-Marginal (PM) Markov Chain Monte Carlo (MCMC) can be applied to the SR posterior, it mixes poorly for concentrated targets, such as those obtained with many observations. As such, we propose to use Stochastic Gradient (SG) MCMC, which improves performance over PM-MCMC and scales to higher-dimensional setups as it is rejection-free. SG-MCMC requires differentiating the simulator model; we achieve this effortlessly by implementing the simulator models using automatic differentiation libraries. We compare SG-MCMC sampling for the SR posterior with related LFI approaches and find that the former scales to larger sample sizes and works well on the raw data, while other methods require determining suitable summary statistics. On a chaotic dynamical system from meteorology, our method even allows inferring the parameters of a neural network used to parametrize a part of the update equations.
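The Energy Score mentioned in the abstract admits an unbiased empirical estimate from simulator draws, which is what makes the Scoring Rule posterior usable when the likelihood cannot be evaluated. The sketch below illustrates that estimator and the resulting unnormalized log-posterior; it is a minimal illustration based on the standard Energy Score formula, not the authors' implementation, and the function names (`energy_score_estimate`, `log_sr_posterior`), the exponent `beta`, and the loss weight `w` are illustrative choices.

```python
import numpy as np

def energy_score_estimate(sims, y, beta=1.0):
    """Unbiased empirical estimate of the Energy Score ES(P_theta, y)
    from m simulator draws `sims` (shape (m, d)) and observation `y` (shape (d,)):
    ES = E||X - y||^beta - 0.5 * E||X - X'||^beta, X, X' ~ P_theta independent."""
    m = sims.shape[0]
    # first term: average distance between each simulation and the observation
    term1 = np.mean(np.linalg.norm(sims - y, axis=1) ** beta)
    # second term: average pairwise distance between distinct simulations;
    # excluding the (zero) diagonal and dividing by m*(m-1) keeps it unbiased
    diffs = np.linalg.norm(sims[:, None, :] - sims[None, :, :], axis=2) ** beta
    term2 = (diffs.sum() - np.trace(diffs)) / (m * (m - 1))
    return term1 - 0.5 * term2

def log_sr_posterior(log_prior_theta, sims, y, w=1.0):
    """Log of the (unnormalized) Scoring Rule posterior: the likelihood is
    replaced by exp(-w * loss), so log-posterior = log prior - w * ES_hat."""
    return log_prior_theta - w * energy_score_estimate(sims, y)
```

In a pseudo-marginal or stochastic-gradient MCMC scheme, each proposed `theta` would be plugged into the simulator to produce a fresh batch `sims`, and (for SG-MCMC) the gradient of `log_sr_posterior` with respect to `theta` would flow through the simulator via automatic differentiation.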
Pages: 3628-3686
Page count: 59