Particle Gibbs with Ancestor Sampling

Cited by: 0
Authors
Lindsten, Fredrik [1 ,2 ]
Jordan, Michael I. [3 ,4 ]
Schon, Thomas B. [5 ]
Affiliations
[1] Univ Cambridge, Dept Engn, Cambridge CB2 1PZ, England
[2] Linköping Univ, Div Automat Control, S-58183 Linköping, Sweden
[3] Univ Calif Berkeley, Div Comp Sci, Berkeley, CA 94720 USA
[4] Univ Calif Berkeley, Dept Stat, Berkeley, CA 94720 USA
[5] Uppsala Univ, Dept Informat Technol, S-75105 Uppsala, Sweden
Funding
Swedish Research Council; UK Engineering and Physical Sciences Research Council;
Keywords
particle Markov chain Monte Carlo; sequential Monte Carlo; Bayesian inference; non-Markovian models; state-space models; stochastic approximation; simulation methods; inference; time; prediction; volatility; likelihood; filters; models;
DOI
not available
Chinese Library Classification
TP [automation technology, computer technology];
Subject classification code
0812;
Abstract
Particle Markov chain Monte Carlo (PMCMC) is a systematic way of combining the two main tools used for Monte Carlo statistical inference: sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC). We present a new PMCMC algorithm that we refer to as particle Gibbs with ancestor sampling (PGAS). PGAS provides the data analyst with an off-the-shelf class of Markov kernels that can be used to simulate, for instance, the typically high-dimensional and highly autocorrelated state trajectory in a state-space model. The ancestor sampling procedure enables fast mixing of the PGAS kernel even when using seemingly few particles in the underlying SMC sampler. This is important as it can significantly reduce the computational burden that is typically associated with using SMC. PGAS is conceptually similar to the existing particle Gibbs with backward simulation (PGBS) procedure. Instead of using separate forward and backward sweeps as in PGBS, however, we achieve the same effect in a single forward sweep. This makes PGAS well suited for addressing inference problems not only in state-space models, but also in models with more complex dependencies, such as non-Markovian, Bayesian nonparametric, and general probabilistic graphical models.
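To make the single forward sweep concrete, here is a minimal sketch of one PGAS update for a toy linear-Gaussian state-space model, using a bootstrap proposal. This is an illustration written for this record, not code from the paper: the model, the parameter defaults, and the names log_transition, log_likelihood, and pgas_kernel are all assumptions.

import numpy as np

# Minimal PGAS sketch (illustrative assumption, not from the paper) for
#   x_t = a * x_{t-1} + v_t,  v_t ~ N(0, q)   (transition)
#   y_t = x_t + e_t,          e_t ~ N(0, r)   (observation)

def log_transition(x_next, x_prev, a=0.9, q=1.0):
    # Log transition density f(x_next | x_prev), up to an additive constant.
    return -0.5 * (x_next - a * x_prev) ** 2 / q

def log_likelihood(y_t, x_t, r=1.0):
    # Log observation density g(y_t | x_t), up to an additive constant.
    return -0.5 * (y_t - x_t) ** 2 / r

def pgas_kernel(y, x_ref, n_particles, rng, a=0.9, q=1.0):
    # One sweep of the PGAS kernel: condition on the reference trajectory
    # x_ref, run a bootstrap particle filter with ancestor sampling, and
    # return a newly sampled trajectory.
    T, N = len(y), n_particles
    x = np.zeros((T, N))               # particle positions
    anc = np.zeros((T, N), dtype=int)  # ancestor indices

    # t = 0: sample N-1 particles from the prior (assumed N(0, q)) and
    # deterministically insert the reference particle in the last slot.
    x[0, :-1] = rng.normal(0.0, np.sqrt(q), size=N - 1)
    x[0, -1] = x_ref[0]
    logw = log_likelihood(y[0], x[0])

    for t in range(1, T):
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Multinomial resampling and propagation for the N-1 free particles.
        anc[t, :-1] = rng.choice(N, size=N - 1, p=w)
        x[t, :-1] = a * x[t - 1, anc[t, :-1]] + rng.normal(0.0, np.sqrt(q), size=N - 1)
        # The reference particle's position is kept fixed...
        x[t, -1] = x_ref[t]
        # ...but its ancestor is sampled with probability proportional to
        # w_{t-1}^j * f(x_ref[t] | x_{t-1}^j): the ancestor sampling step.
        logw_anc = logw + log_transition(x_ref[t], x[t - 1], a, q)
        w_anc = np.exp(logw_anc - logw_anc.max())
        w_anc /= w_anc.sum()
        anc[t, -1] = rng.choice(N, p=w_anc)
        logw = log_likelihood(y[t], x[t])

    # Draw one particle at time T-1 and trace its ancestry backwards.
    w = np.exp(logw - logw.max())
    w /= w.sum()
    k = rng.choice(N, p=w)
    traj = np.empty(T)
    for t in reversed(range(T)):
        traj[t] = x[t, k]
        k = anc[t, k]
    return traj

# Usage sketch: iterating the kernel yields a Markov chain of trajectories
# targeting the smoothing distribution p(x_{0:T-1} | y_{0:T-1}).
rng = np.random.default_rng(0)
T = 50
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + rng.normal()
y = x_true + rng.normal(size=T)
x_ref = np.zeros(T)
for _ in range(100):
    x_ref = pgas_kernel(y, x_ref, n_particles=10, rng=rng)

Without the ancestor sampling step (i.e., keeping the reference particle's whole ancestry fixed), this reduces to the basic particle Gibbs kernel, which is known to mix poorly when the number of particles is small; sampling a new ancestor for the reference at each step is what lets PGAS break the reference trajectory apart in a single forward sweep.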
Pages: 2145-2184 (40 pages)