Stochastic gradient Hamiltonian Monte Carlo with variance reduction for Bayesian inference

Cited by: 1
Authors
Zhize Li
Tianyi Zhang
Shuyu Cheng
Jun Zhu
Jian Li
Affiliations
[1] Tsinghua University
Source
Machine Learning | 2019 / Vol. 108
Keywords
Hamiltonian Monte Carlo; Variance reduction; Bayesian inference;
DOI
Not available
Abstract
Gradient-based Monte Carlo sampling algorithms, such as Langevin dynamics and Hamiltonian Monte Carlo, are important methods for Bayesian inference. In large-scale settings, full gradients are too expensive to compute, so stochastic gradients evaluated on mini-batches are used instead. To reduce the high variance of these noisy stochastic gradients, Dubey et al. (in: Advances in Neural Information Processing Systems, pp 1154–1162, 2016) applied the standard variance reduction technique to stochastic gradient Langevin dynamics and obtained both theoretical and experimental improvements. In this paper, we apply variance reduction techniques to Hamiltonian Monte Carlo and achieve better theoretical convergence results than variance-reduced Langevin dynamics. Moreover, we apply a symmetric splitting scheme in our variance-reduced Hamiltonian Monte Carlo algorithms to further improve the theoretical results. The experimental results are consistent with the theory: variance-reduced Hamiltonian Monte Carlo outperforms variance-reduced Langevin dynamics on Bayesian regression and classification tasks on real-world datasets.
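The variance-reduction idea described in the abstract can be sketched in code. Below is a minimal SVRG-style control-variate gradient inside an SGHMC-type update, demonstrated on a toy 1-D Gaussian model. This is an illustrative sketch, not the authors' exact algorithm or their symmetric splitting scheme; the function names, step size `eta`, friction constant, and epoch/inner-loop sizes are all assumptions made for the example.

```python
import numpy as np

def svrg_hmc(grad_batch, grad_full, n, theta0, epochs=10, m=20,
             batch_size=10, eta=1e-3, friction=0.1, seed=0):
    """Sketch of SVRG-style variance-reduced stochastic gradient HMC.

    grad_full(theta)       -> full-data gradient of the potential U(theta)
    grad_batch(theta, idx) -> unbiased mini-batch estimate of grad U(theta)
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float).copy()
    v = np.zeros_like(theta)
    samples = []
    for _ in range(epochs):
        snapshot = theta.copy()        # snapshot position for the control variate
        g_snap = grad_full(snapshot)   # one full-gradient pass per epoch
        for _ in range(m):
            idx = rng.integers(0, n, size=batch_size)
            # control-variate estimate: mini-batch difference plus snapshot gradient
            g = grad_batch(theta, idx) - grad_batch(snapshot, idx) + g_snap
            # SGHMC momentum update with friction and injected Gaussian noise
            v = (1.0 - friction) * v - eta * g \
                + np.sqrt(2.0 * friction * eta) * rng.standard_normal(theta.shape)
            theta = theta + v
            samples.append(theta.copy())
    return np.array(samples)

# Toy target: y_i ~ N(theta, 1) with a flat prior, so U(theta) = 0.5 * sum_i (theta - y_i)^2
# and the posterior mean is the sample mean of y.
rng = np.random.default_rng(42)
y = rng.normal(2.0, 1.0, size=100)
n = len(y)

grad_full = lambda th: n * th - y.sum()
grad_batch = lambda th, idx: (n / len(idx)) * (len(idx) * th - y[idx].sum())

chain = svrg_hmc(grad_batch, grad_full, n, theta0=np.zeros(1))
est = chain[len(chain) // 2:].mean()   # posterior-mean estimate from the chain's second half
```

On this linear-gradient example the control variate cancels the mini-batch noise exactly, which is why variance reduction helps most when gradients change slowly between snapshots; the chain's second-half mean should sit close to the sample mean of `y`.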
Pages: 1701–1727 (26 pages)
Related Papers (50 total)
  • [21] On Monte Carlo methods for Bayesian inference
    Qian, SS
    Stow, CA
    Borsuk, ME
    ECOLOGICAL MODELLING, 2003, 159 (2-3) : 269 - 277
  • [22] A Hamiltonian Monte-Carlo method for Bayesian inference of supermassive black hole binaries
    Porter, Edward K.
    Carre, Jerome
    CLASSICAL AND QUANTUM GRAVITY, 2014, 31 (14)
  • [23] Bayesian inference for binary neutron star inspirals using a Hamiltonian Monte Carlo algorithm
    Bouffanais, Yann
    Porter, Edward K.
    PHYSICAL REVIEW D, 2019, 100 (10)
  • [24] Neural network gradient Hamiltonian Monte Carlo
    Lingge Li
    Andrew Holbrook
    Babak Shahbaba
    Pierre Baldi
    Computational Statistics, 2019, 34 : 281 - 299
  • [25] Neural network gradient Hamiltonian Monte Carlo
    Li, Lingge
    Holbrook, Andrew
    Shahbaba, Babak
    Baldi, Pierre
    COMPUTATIONAL STATISTICS, 2019, 34 (01) : 281 - 299
  • [26] Hamiltonian Monte Carlo method for estimating variance components
    Arakawa, Aisaku
    Hayashi, Takeshi
    Taniguchi, Masaaki
    Mikawa, Satoshi
    Nishio, Motohide
    ANIMAL SCIENCE JOURNAL, 2021, 92 (01)
  • [27] Bayesian Inference for Mixed Gaussian GARCH-Type Model by Hamiltonian Monte Carlo Algorithm
    Liang, Rubing
    Qin, Binbin
    Xia, Qiang
    COMPUTATIONAL ECONOMICS, 2024, 63 (01) : 193 - 220
  • [28] Monte Carlo Hamiltonian from stochastic basis
    Huang, CQ
    Kröger, H
    Luo, XQ
    Moriarty, KJM
    PHYSICS LETTERS A, 2002, 299 (5-6) : 483 - 493
  • [29] On the Convergence of Hamiltonian Monte Carlo with Stochastic Gradients
    Zou, Difan
    Gu, Quanquan
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139