Metropolis-Hastings algorithms with adaptive proposals

Cited by: 30
Authors
Cai, Bo [1]
Meyer, Renate [2]
Perron, Francois [3]
Affiliations
[1] Univ S Carolina, Dept Epidemiol & Biostat, Columbia, SC 29208 USA
[2] Univ Auckland, Dept Stat, Auckland 1, New Zealand
[3] Univ Montreal, Dept Math & Stat, Montreal, PQ H3C 3J7, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Adaptive rejection Metropolis sampling; Bayesian inference; Markov chain Monte Carlo; Non-conjugate distribution; State-space model;
DOI
10.1007/s11222-008-9051-5
CLC number
TP301 [Theory, Methods];
Discipline code
081202;
Abstract
Different strategies have been proposed to improve the mixing and convergence properties of Markov chain Monte Carlo algorithms. These are mainly concerned with customizing the proposal density in the Metropolis-Hastings algorithm to the specific target density, and they require a detailed exploratory analysis of the stationary distribution and/or some preliminary experiments to determine an efficient proposal. Various Metropolis-Hastings algorithms have been suggested that make use of previously sampled states in defining an adaptive proposal density. Here we propose a general class of adaptive Metropolis-Hastings algorithms based on Metropolis-Hastings-within-Gibbs sampling. For the case of a one-dimensional target distribution, we present two novel algorithms using mixtures of triangular and trapezoidal densities. These can also be seen as improved versions of the all-purpose adaptive rejection Metropolis sampling (ARMS) algorithm for sampling from non-log-concave univariate densities. Using various examples, we demonstrate their properties and efficiency and point out their advantages over ARMS and other adaptive alternatives such as the Normal Kernel Coupler.
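To illustrate the core idea the abstract describes, namely a Metropolis-Hastings sampler whose proposal is tuned from the chain's own history, here is a minimal one-dimensional sketch. It adapts a Gaussian random-walk proposal scale from the acceptance record (a Robbins-Monro-style scheme); the function name and the adaptation rule are illustrative assumptions, not the paper's triangular/trapezoidal mixture construction.

```python
import math
import random

def adaptive_metropolis_1d(log_target, x0, n_iter=5000, seed=0):
    """Random-walk Metropolis whose Gaussian proposal scale adapts
    using the chain's own acceptance history -- an illustrative
    adaptive-proposal scheme, not the paper's construction."""
    rng = random.Random(seed)
    x, logp = x0, log_target(x0)
    scale = 1.0  # proposal standard deviation, adapted online
    samples = []
    for i in range(1, n_iter + 1):
        y = x + rng.gauss(0.0, scale)
        logq = log_target(y)
        accepted = math.log(rng.random()) < logq - logp
        if accepted:
            x, logp = y, logq
        # Nudge the scale toward a ~0.44 acceptance rate (a common
        # one-dimensional target); the diminishing step size makes the
        # adaptation vanish, which is what preserves ergodicity.
        scale *= math.exp(((1.0 if accepted else 0.0) - 0.44) / i ** 0.6)
        samples.append(x)
    return samples, scale
```

Run on a standard normal log-density (`lambda x: -0.5 * x * x`), the post-burn-in draws recover a mean near 0 and a variance near 1, while the proposal scale settles without hand-tuning.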
Pages: 421-433
Page count: 13