Split Hamiltonian Monte Carlo

Authors
Babak Shahbaba
Shiwei Lan
Wesley O. Johnson
Radford M. Neal
Affiliations
[1] University of California, Department of Statistics and Department of Computer Science
[2] University of California, Department of Statistics
[3] University of Toronto, Department of Statistics and Department of Computer Science
Source
Statistics and Computing | 2014 / Volume 24
Keywords
Markov chain Monte Carlo; Hamiltonian dynamics; Bayesian analysis
Abstract
We show how the Hamiltonian Monte Carlo algorithm can sometimes be speeded up by “splitting” the Hamiltonian in a way that allows much of the movement around the state space to be done at low computational cost. One context where this is possible is when the log density of the distribution of interest (the potential energy function) can be written as the log of a Gaussian density, which is a quadratic function, plus a slowly-varying function. Hamiltonian dynamics for quadratic energy functions can be analytically solved. With the splitting technique, only the slowly-varying part of the energy needs to be handled numerically, and this can be done with a larger stepsize (and hence fewer steps) than would be necessary with a direct simulation of the dynamics. Another context where splitting helps is when the most important terms of the potential energy function and its gradient can be evaluated quickly, with only a slowly-varying part requiring costly computations. With splitting, the quick portion can be handled with a small stepsize, while the costly portion uses a larger stepsize. We show that both of these splitting approaches can reduce the computational cost of sampling from the posterior distribution for a logistic regression model, using either a Gaussian approximation centered on the posterior mode, or a Hamiltonian split into a term that depends on only a small number of critical cases, and another term that involves the larger number of cases whose influence on the posterior distribution is small.
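The first splitting described in the abstract can be illustrated with a minimal sketch: the potential is written as U(q) = U0(q) + U1(q), where U0 is the quadratic energy of a Gaussian whose dynamics are solved exactly (a rotation in phase space), and only the slowly-varying residual U1 is handled numerically. The target, the residual term `U1`, and all tuning parameters below are hypothetical illustrations, not the paper's experimental setup; for simplicity the Gaussian part is a 1-D standard normal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D target: density proportional to exp(-q^2/2 - U1(q)),
# i.e. a standard Gaussian (U0(q) = q^2/2, handled analytically)
# times a slowly-varying correction handled numerically.
def U1(q):
    return 0.1 * np.cos(q)          # assumed slowly-varying residual

def grad_U1(q):
    return -0.1 * np.sin(q)

def split_hmc_step(q, eps, n_steps):
    """One split-HMC update: half steps of U1's gradient around
    an exact rotation for the Gaussian part U0."""
    p = rng.standard_normal()       # resample momentum
    q0, p0 = q, p
    for _ in range(n_steps):
        p -= 0.5 * eps * grad_U1(q)                   # half step, slow part
        q, p = (q * np.cos(eps) + p * np.sin(eps),    # exact dynamics for
                -q * np.sin(eps) + p * np.cos(eps))   # H0 = q^2/2 + p^2/2
        p -= 0.5 * eps * grad_U1(q)                   # half step, slow part
    # Metropolis accept/reject with the full Hamiltonian
    H_old = 0.5 * q0**2 + U1(q0) + 0.5 * p0**2
    H_new = 0.5 * q**2 + U1(q) + 0.5 * p**2
    return q if rng.random() < np.exp(H_old - H_new) else q0

# Draw samples; the stepsize can be large because only U1 is integrated
# numerically and it varies slowly.
q = 0.0
samples = np.array([q := split_hmc_step(q, eps=0.5, n_steps=10)
                    for _ in range(2000)])
```

Because the exact rotation absorbs all of the Gaussian part's dynamics, the stepsize is limited only by the slowly-varying residual, which is what allows fewer, cheaper steps than a direct leapfrog simulation of the full Hamiltonian.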
Pages: 339 - 349
Page count: 10