Particle Gibbs with Ancestor Sampling

Cited: 0
Authors
Lindsten, Fredrik [1 ,2 ]
Jordan, Michael I. [3 ,4 ]
Schon, Thomas B. [5 ]
Affiliations
[1] Univ Cambridge, Dept Engn, Cambridge CB2 1PZ, England
[2] Linkoping Univ, Div Automat Control, S-58183 Linkoping, Sweden
[3] Univ Calif Berkeley, Div Comp Sci, Berkeley, CA 94720 USA
[4] Univ Calif Berkeley, Dept Stat, Berkeley, CA 94720 USA
[5] Uppsala Univ, Dept Informat Technol, S-75105 Uppsala, Sweden
Funding
Swedish Research Council; UK Engineering and Physical Sciences Research Council;
Keywords
particle Markov chain Monte Carlo; sequential Monte Carlo; Bayesian inference; non-Markovian models; state-space models; STOCHASTIC-APPROXIMATION; SIMULATION METHODS; INFERENCE; TIME; PREDICTION; VOLATILITY; LIKELIHOOD; FILTERS; MODELS;
DOI
Not available
CLC Classification Number
TP [Automation and Computer Technology];
Subject Classification Code
0812;
Abstract
Particle Markov chain Monte Carlo (PMCMC) is a systematic way of combining the two main tools used for Monte Carlo statistical inference: sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC). We present a new PMCMC algorithm that we refer to as particle Gibbs with ancestor sampling (PGAS). PGAS provides the data analyst with an off-the-shelf class of Markov kernels that can be used to simulate, for instance, the typically high-dimensional and highly autocorrelated state trajectory in a state-space model. The ancestor sampling procedure enables fast mixing of the PGAS kernel even when using seemingly few particles in the underlying SMC sampler. This is important as it can significantly reduce the computational burden that is typically associated with using SMC. PGAS is conceptually similar to the existing PG with backward simulation (PGBS) procedure. Instead of using separate forward and backward sweeps as in PGBS, however, we achieve the same effect in a single forward sweep. This makes PGAS well suited for addressing inference problems not only in state-space models, but also in models with more complex dependencies, such as non-Markovian, Bayesian nonparametric, and general probabilistic graphical models.
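The single-forward-sweep idea in the abstract can be illustrated with a minimal sketch of one PGAS kernel sweep. Everything here (the function name `pgas_kernel`, the AR(1) model, and the parameter names `phi`, `q`, `r`) is our own illustrative choice, not code from the paper; the sketch uses bootstrap proposals and multinomial resampling for simplicity.

```python
import numpy as np

def pgas_kernel(y, x_ref, n_particles, phi=0.9, q=1.0, r=1.0, rng=None):
    """One sweep of the PGAS Markov kernel, targeting the smoothing
    distribution p(x_{0:T-1} | y_{0:T-1}) of the AR(1) state-space model
        x_t = phi * x_{t-1} + v_t,  v_t ~ N(0, q)
        y_t = x_t + e_t,            e_t ~ N(0, r).
    x_ref is the conditioned (reference) trajectory from the previous
    iteration; a new trajectory is returned."""
    rng = np.random.default_rng() if rng is None else rng
    T, N = len(y), n_particles
    x = np.zeros((T, N))
    anc = np.zeros((T, N), dtype=int)

    # t = 0: sample from the prior; the last particle holds the reference.
    x[0] = rng.normal(0.0, np.sqrt(q), N)
    x[0, -1] = x_ref[0]
    logw = -0.5 * (y[0] - x[0]) ** 2 / r

    for t in range(1, T):
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Multinomial resampling for particles 0..N-2.
        anc[t, :-1] = rng.choice(N, size=N - 1, p=w)
        # Ancestor sampling: draw the reference particle's ancestor with
        # probability proportional to w^i * p(x_ref[t] | x_{t-1}^i).
        logw_as = np.log(w) - 0.5 * (x_ref[t] - phi * x[t - 1]) ** 2 / q
        w_as = np.exp(logw_as - logw_as.max())
        anc[t, -1] = rng.choice(N, p=w_as / w_as.sum())
        # Propagate, keeping the reference particle's state fixed.
        x[t] = phi * x[t - 1, anc[t]] + rng.normal(0.0, np.sqrt(q), N)
        x[t, -1] = x_ref[t]
        logw = -0.5 * (y[t] - x[t]) ** 2 / r

    # Draw one particle at time T-1 and trace its ancestral lineage back.
    w = np.exp(logw - logw.max())
    k = rng.choice(N, p=w / w.sum())
    traj = np.empty(T)
    traj[-1] = x[-1, k]
    for t in range(T - 2, -1, -1):
        k = anc[t + 1, k]
        traj[t] = x[t, k]
    return traj
```

Iterating the kernel (feeding each output trajectory back in as the next reference) yields the PGAS Markov chain; the ancestor-sampling step is what breaks the path degeneracy that would otherwise make the conditioned particle filter mix slowly for small `n_particles`.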
Pages: 2145-2184
Page count: 40
Related Papers
50 items in total
  • [1] Particle Gibbs with Ancestor Sampling
    [J]. Microtome Publishing (15):
  • [2] Particle Gibbs with Ancestor Sampling for Probabilistic Programs
    van de Meent, Jan-Willem
    Yang, Hongseok
    Mansinghka, Vikash
    Wood, Frank
    [J]. ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 38, 2015, 38 : 986 - 994
  • [3] Rejection-Sampling-Based Ancestor Sampling for Particle Gibbs
    Hostettler, Roland
    Sarkka, Simo
    [J]. 2019 IEEE 29TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2019,
  • [4] Replica Exchange Particle-Gibbs Method with Ancestor Sampling
    Inoue, Hiroaki
    Hukushima, Koji
    Omori, Toshiaki
    [J]. JOURNAL OF THE PHYSICAL SOCIETY OF JAPAN, 2020, 89 (10)
  • [5] Particle Gibbs with Ancestor Sampling for Identification of Tire-Friction Parameters
    Berntorp, Karl
    Di Cairano, Stefano
    [J]. IFAC PAPERSONLINE, 2017, 50 (01): : 14849 - 14854
  • [6] Flexible model comparison of unobserved components models using particle Gibbs with ancestor sampling
    Nonejad, Nima
    [J]. ECONOMICS LETTERS, 2015, 133 : 35 - 39
  • [7] On particle Gibbs sampling
    Chopin, Nicolas
    Singh, Sumeetpal S.
    [J]. BERNOULLI, 2015, 21 (03) : 1855 - 1883
  • [8] Parameter elimination in particle Gibbs sampling
    Wigren, Anna
    Risuleo, Riccardo Sven
    Murray, Lawrence
    Lindsten, Fredrik
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [9] Kernel Smoothing Conditional Particle Filter With Ancestor Sampling
    El Kolei, Salima
    Navarro, Fabien
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2024, 72 : 3380 - 3392
  • [10] Particle Gibbs with ancestor sampling for stochastic volatility models with: heavy tails, in mean effects, leverage, serial dependence and structural breaks
    Nonejad, Nima
    [J]. STUDIES IN NONLINEAR DYNAMICS AND ECONOMETRICS, 2015, 19 (05): : 561 - 584