Stochastic Gradient Richardson-Romberg Markov Chain Monte Carlo

Cited by: 0
Authors
Durmus, Alain [1 ]
Simsekli, Umut [1 ]
Moulines, Eric [2 ]
Badeau, Roland [1 ]
Richard, Gael [1 ]
Affiliations
[1] Univ Paris Saclay, Telecom ParisTech, CNRS, LTCI, F-75013 Paris, France
[2] Ecole Polytech, UMR 7641, Ctr Math Appl, Palaiseau, France
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Stochastic Gradient Markov Chain Monte Carlo (SG-MCMC) algorithms have become increasingly popular for Bayesian inference in large-scale applications. Even though these methods have proved useful in several scenarios, their performance is often limited by their bias. In this study, we propose a novel sampling algorithm that aims to reduce the bias of SG-MCMC while keeping the variance at a reasonable level. Our approach is based on a numerical sequence acceleration method, namely Richardson-Romberg extrapolation, which simply boils down to running almost the same SG-MCMC algorithm twice in parallel with different step sizes. We illustrate our framework on the popular Stochastic Gradient Langevin Dynamics (SGLD) algorithm and propose a novel SG-MCMC algorithm referred to as Stochastic Gradient Richardson-Romberg Langevin Dynamics (SGRRLD). We provide a formal theoretical analysis and show that SGRRLD is asymptotically consistent, satisfies a central limit theorem, and that its non-asymptotic bias and mean squared error can be bounded. Our results show that SGRRLD attains higher rates of convergence than SGLD both in finite time and asymptotically, and that it achieves the theoretical accuracy of methods based on higher-order integrators. We support our findings with experiments on both synthetic and real data.
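The extrapolation step mentioned in the abstract can be illustrated with a short sketch. The Python snippet below runs plain SGLD twice with step sizes h and 2h on a toy one-dimensional Gaussian posterior and combines the two time averages as 2*est_h - est_2h, which cancels the leading O(h) term of the ergodic-average bias. The synthetic data, the step sizes, and helper names such as stoch_grad_log_post and sgld_time_average are assumptions made for illustration, and the two chains are run independently here; this is only a minimal sketch of the extrapolation idea, not the authors' SGRRLD implementation.

# A minimal sketch, not the authors' implementation: plain SGLD on a toy 1-D Gaussian
# posterior, run with step sizes h and 2h, followed by the Richardson-Romberg combination.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data x_i ~ N(mu_true, 1); the target is the posterior over mu
# under a N(0, prior_var) prior, whose mean is known in closed form for checking.
mu_true = 2.0
data = rng.normal(mu_true, 1.0, size=1000)
N, batch_size, prior_var = len(data), 50, 10.0

def stoch_grad_log_post(mu):
    # Unbiased minibatch estimate of the gradient of the log posterior.
    batch = rng.choice(data, size=batch_size, replace=False)
    grad_lik = (N / batch_size) * np.sum(batch - mu)  # rescaled likelihood term
    return grad_lik - mu / prior_var                  # plus the prior term

def sgld_time_average(step, n_iter):
    # Run SGLD with a fixed step size and return the time average of the iterates.
    mu, acc = 0.0, 0.0
    for _ in range(n_iter):
        mu += step * stoch_grad_log_post(mu) + np.sqrt(2.0 * step) * rng.normal()
        acc += mu
    return acc / n_iter

h, n_iter = 1e-4, 20000
est_h = sgld_time_average(h, n_iter)          # chain with step size h
est_2h = sgld_time_average(2.0 * h, n_iter)   # chain with step size 2h

# Richardson-Romberg combination: the O(step) terms of the two biases cancel.
est_rr = 2.0 * est_h - est_2h
exact = np.sum(data) / (N + 1.0 / prior_var)  # exact posterior mean for reference
print(f"SGLD(h)={est_h:.4f}  SGLD(2h)={est_2h:.4f}  RR={est_rr:.4f}  exact={exact:.4f}")

Combining the two chains in this way removes the first-order bias of the plain SGLD averages at the price of running a second chain and incurring some extra variance, which is the trade-off the abstract refers to when it speaks of keeping the variance at a reasonable level.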
Pages: 9