Stochastic gradient Hamiltonian Monte Carlo with variance reduction for Bayesian inference

Cited: 1
Authors
Zhize Li
Tianyi Zhang
Shuyu Cheng
Jun Zhu
Jian Li
Affiliations
[1] Tsinghua University
Source
Machine Learning | 2019 / Vol. 108
Keywords
Hamiltonian Monte Carlo; Variance reduction; Bayesian inference
DOI
Not available
Abstract
Gradient-based Monte Carlo sampling algorithms, such as Langevin dynamics and Hamiltonian Monte Carlo, are important methods for Bayesian inference. In large-scale settings, full gradients are not affordable, so stochastic gradients evaluated on mini-batches are used as a replacement. To reduce the high variance of these noisy stochastic gradients, Dubey et al. (in: Advances in Neural Information Processing Systems, pp 1154–1162, 2016) applied the standard variance reduction technique to stochastic gradient Langevin dynamics and obtained both theoretical and experimental improvements. In this paper, we apply the same variance reduction technique to Hamiltonian Monte Carlo and achieve better theoretical convergence results than variance-reduced Langevin dynamics. Moreover, we apply the symmetric splitting scheme in our variance-reduced Hamiltonian Monte Carlo algorithms to further improve the theoretical results. The experimental results are consistent with the theory: variance-reduced Hamiltonian Monte Carlo outperforms variance-reduced Langevin dynamics on Bayesian regression and classification tasks with real-world datasets.
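The variance-reduction idea described in the abstract can be sketched in a few lines: an SVRG-style gradient estimator (mini-batch gradient corrected by a periodically refreshed full gradient at a snapshot point) is plugged into the stochastic gradient HMC update with friction and injected noise. The sketch below is a minimal illustration on a toy Bayesian linear regression, assuming a standard normal prior; all function names and hyperparameters are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

# Toy model: Bayesian linear regression with standard normal prior,
# U(theta) = 0.5*||X@theta - y||^2 + 0.5*||theta||^2.

def full_grad(theta, X, y):
    # Full gradient of the negative log-posterior U over all n data points.
    return X.T @ (X @ theta - y) + theta

def svrg_grad(theta, snapshot, mu, X, y, idx):
    # SVRG-style estimator: (n/b) * (grad_B(theta) - grad_B(snapshot)) + mu,
    # where mu = full_grad(snapshot).  Unbiased, and equal to mu exactly
    # when theta == snapshot (zero variance at the snapshot point).
    n, b = X.shape[0], len(idx)
    Xb, yb = X[idx], y[idx]
    gb = Xb.T @ (Xb @ theta - yb) + (b / n) * theta
    gs = Xb.T @ (Xb @ snapshot - yb) + (b / n) * snapshot
    return (n / b) * (gb - gs) + mu

def sghmc_vr_step(theta, r, snapshot, mu, X, y, rng,
                  eta=1e-4, alpha=0.1, batch=10):
    # One SGHMC step (friction alpha, step size eta, injected noise
    # N(0, 2*alpha*eta)) using the variance-reduced gradient estimate.
    idx = rng.choice(len(y), size=batch, replace=False)
    g = svrg_grad(theta, snapshot, mu, X, y, idx)
    r = (1 - alpha) * r - eta * g \
        + rng.normal(0.0, np.sqrt(2 * alpha * eta), size=r.shape)
    return theta + r, r

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=100)

theta, r = np.zeros(3), np.zeros(3)
for t in range(200):
    if t % 50 == 0:  # refresh the snapshot and its full gradient each epoch
        snapshot, mu = theta.copy(), full_grad(theta, X, y)
    theta, r = sghmc_vr_step(theta, r, snapshot, mu, X, y, rng)
```

The occasional full-gradient pass at the snapshot is the price paid for the reduced mini-batch variance; right after a snapshot refresh the estimator has zero variance, and it grows only as the iterate drifts away from the snapshot.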
Pages: 1701–1727
Page count: 26
Related papers
50 results in total
  • [1] Stochastic gradient Hamiltonian Monte Carlo with variance reduction for Bayesian inference
    Li, Zhize
    Zhang, Tianyi
    Cheng, Shuyu
    Zhu, Jun
    Li, Jian
    [J]. MACHINE LEARNING, 2019, 108 (8-9) : 1701 - 1727
  • [2] Stochastic Gradient Hamiltonian Monte Carlo Methods with Recursive Variance Reduction
    Zou, Difan
    Xu, Pan
    Gu, Quanquan
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [3] On the Theory of Variance Reduction for Stochastic Gradient Monte Carlo
    Chatterji, Niladri S.
    Flammarion, Nicolas
    Ma, Yi-An
    Bartlett, Peter L.
    Jordan, Michael I.
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [4] Stochastic Gradient Hamiltonian Monte Carlo
    Chen, Tianqi
    Fox, Emily B.
    Guestrin, Carlos
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 32 (CYCLE 2), 2014, 32 : 1683 - 1691
  • [5] Modified Hamiltonian Monte Carlo for Bayesian inference
    Radivojevic, Tijana
    Akhmatskaya, Elena
    [J]. STATISTICS AND COMPUTING, 2020, 30 (02) : 377 - 404
  • [7] Inference in Deep Gaussian Processes using Stochastic Gradient Hamiltonian Monte Carlo
    Havasi, Marton
    Hernandez-Lobato, Jose Miguel
    Jose Murillo-Fuentes, Juan
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [8] A Hybrid Stochastic Gradient Hamiltonian Monte Carlo Method
    Zhang, Chao
    Li, Zhijian
    Shen, Zebang
    Xie, Jiahao
    Qian, Hui
    [J]. THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 10842 - 10850
  • [9] Bayesian inference for a single factor copula stochastic volatility model using Hamiltonian Monte Carlo
    Kreuzer, Alexander
    Czado, Claudia
    [J]. ECONOMETRICS AND STATISTICS, 2021, 19 : 130 - 150
  • [10] Bayesian parameter inference in hydrological modelling using a Hamiltonian Monte Carlo approach with a stochastic rain model
    Ulzega, Simone
    Albert, Carlo
    [J]. HYDROLOGY AND EARTH SYSTEM SCIENCES, 2023, 27 (15) : 2935 - 2950