Split Hamiltonian Monte Carlo

Cited by: 34
Authors
Shahbaba, Babak [1 ,2 ]
Lan, Shiwei [1 ]
Johnson, Wesley O. [1 ]
Neal, Radford M. [3 ,4 ]
Affiliations
[1] Univ Calif Irvine, Dept Stat, Irvine, CA 92697 USA
[2] Univ Calif Irvine, Dept Comp Sci, Irvine, CA 92697 USA
[3] Univ Toronto, Dept Stat, Toronto, ON M5S 3G3, Canada
[4] Univ Toronto, Dept Comp Sci, Toronto, ON M5S 3G3, Canada
Funding
U.S. National Science Foundation; Natural Sciences and Engineering Research Council of Canada
Keywords
Markov chain Monte Carlo; Hamiltonian dynamics; Bayesian analysis
DOI
10.1007/s11222-012-9373-1
Chinese Library Classification
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
We show how the Hamiltonian Monte Carlo algorithm can sometimes be sped up by "splitting" the Hamiltonian in a way that allows much of the movement around the state space to be done at low computational cost. One context where this is possible is when the log density of the distribution of interest (the negative of the potential energy function) can be written as the log of a Gaussian density, which is a quadratic function, plus a slowly varying function. Hamiltonian dynamics for quadratic energy functions can be solved analytically. With the splitting technique, only the slowly varying part of the energy needs to be handled numerically, and this can be done with a larger stepsize (and hence fewer steps) than would be necessary with a direct simulation of the dynamics. Another context where splitting helps is when the most important terms of the potential energy function and its gradient can be evaluated quickly, with only a slowly varying part requiring costly computations. With splitting, the quick portion can be handled with a small stepsize, while the costly portion uses a larger stepsize. We show that both of these splitting approaches can reduce the computational cost of sampling from the posterior distribution for a logistic regression model, using either a Gaussian approximation centered on the posterior mode, or a Hamiltonian split into a term that depends on only a small number of critical cases and another term that involves the larger number of cases whose influence on the posterior distribution is small.
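The first (Gaussian-split) scheme lends itself to a compact implementation. The sketch below shows one such update in Python, under stated assumptions: a unit mass matrix, a positive-definite precision matrix Sigma_inv taken from a Gaussian approximation centered at the posterior mode mu, and placeholder callables log_post and grad_log_post; these names are illustrative, not the authors' code. The quadratic part of the Hamiltonian is advanced exactly as a harmonic oscillator in the eigenbasis of Sigma_inv, so only the gradient of the slowly varying residual is evaluated numerically.

import numpy as np

def split_hmc_step(q, log_post, grad_log_post, mu, Sigma_inv, eps, L, rng):
    """One split-HMC update (sketch). U(q) = -log_post(q) is split into
    U0(q) = 0.5 (q - mu)^T Sigma_inv (q - mu), the quadratic energy of a
    Gaussian approximation, plus a slowly varying residual U1 = U - U0.
    The U0-plus-kinetic part is solved analytically; only grad U1 is
    stepped numerically, which permits a larger stepsize eps."""
    lam, V = np.linalg.eigh(Sigma_inv)   # Sigma_inv = V diag(lam) V^T
    omega = np.sqrt(lam)                 # oscillator frequencies

    def grad_U1(x):                      # residual gradient: grad U - grad U0
        return -grad_log_post(x) - Sigma_inv @ (x - mu)

    p = rng.standard_normal(q.shape)     # momentum draw, unit mass matrix
    q_new, p_new = q.copy(), p.copy()
    H_old = -log_post(q) + 0.5 * p @ p

    for _ in range(L):
        p_new -= 0.5 * eps * grad_U1(q_new)      # half kick from residual U1
        a = V.T @ (q_new - mu)                   # rotate exactly under the
        b = V.T @ p_new                          # Gaussian (harmonic) flow
        c, s = np.cos(omega * eps), np.sin(omega * eps)
        a, b = a * c + (b / omega) * s, -a * omega * s + b * c
        q_new, p_new = mu + V @ a, V @ b
        p_new -= 0.5 * eps * grad_U1(q_new)      # second half kick

    H_new = -log_post(q_new) + 0.5 * p_new @ p_new
    return q_new if rng.random() < np.exp(H_old - H_new) else q  # Metropolis

The second (data-splitting) scheme has the same outer structure, except that the exact Gaussian flow in the middle of each step is replaced by several small leapfrog steps on the cheap, dominant terms of the potential, while the costly residual is still applied only in the outer half kicks.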
Pages: 339-349
Page count: 11