On the Computational Complexity of Metropolis-Adjusted Langevin Algorithms for Bayesian Posterior Sampling

Cited: 0
Authors
Tang, Rong [1 ]
Yang, Yun [2 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Math, Hong Kong, Peoples R China
[2] Univ Illinois, Dept Stat, Champaign, IL USA
Keywords
Bayesian inference; Gibbs posterior; large sample theory; log-isoperimetric inequality; Metropolis-adjusted Langevin algorithms; mixing time; Monte Carlo; convergence
DOI
None available
Chinese Library Classification
TP [automation and computer technology]
Discipline Classification Code
0812
Abstract
In this paper, we examine the computational complexity of sampling from a Bayesian posterior (or pseudo-posterior) using the Metropolis-adjusted Langevin algorithm (MALA). MALA first employs a discretization of the Langevin SDE to propose a new state, and then corrects the proposal via a Metropolis-Hastings accept-reject step. Most existing theoretical analyses of MALA rely on the smoothness and strong log-concavity of the target distribution, properties that are often lacking in practical Bayesian problems. Our analysis instead hinges on statistical large sample theory, which constrains the deviation of the Bayesian posterior from being smooth and log-concave in a very specific way. In particular, we introduce a new technique for bounding the mixing time of a Markov chain with a continuous state space via the s-conductance profile, offering improvements over existing techniques in several respects. Using this technique, we establish the optimal parameter-dimension dependence of d^{1/3} and condition-number dependence of κ in the non-asymptotic mixing time upper bound for MALA after the burn-in period, under a standard Bayesian setting where the target posterior distribution is close to a d-dimensional Gaussian with a covariance matrix of condition number κ. We also prove a matching mixing time lower bound for sampling from a multivariate Gaussian via MALA, complementing the upper bound.
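The two-step structure described in the abstract, a Langevin proposal followed by a Metropolis-Hastings accept-reject correction, can be sketched in a few lines of generic code. This is an illustrative sketch of standard MALA, not the paper's analysis; the names `mala`, `grad_log_pi`, and `step` are hypothetical:

```python
import numpy as np

def mala(grad_log_pi, log_pi, x0, step, n_iter, rng=None):
    """Generic Metropolis-adjusted Langevin algorithm (MALA) sketch.

    grad_log_pi, log_pi: gradient and log-density of the target distribution.
    step: Langevin step size h.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    d = x.size
    samples = np.empty((n_iter, d))

    def log_q(a, b):
        # Log density (up to a constant) of proposing b from a under the
        # Gaussian Langevin proposal centered at a + h * grad log pi(a).
        diff = b - a - step * grad_log_pi(a)
        return -np.sum(diff ** 2) / (4 * step)

    for t in range(n_iter):
        # Euler-Maruyama discretization of the Langevin SDE:
        # y = x + h * grad log pi(x) + sqrt(2h) * N(0, I)
        y = x + step * grad_log_pi(x) + np.sqrt(2 * step) * rng.standard_normal(d)
        # Metropolis-Hastings correction for the asymmetric proposal.
        log_alpha = log_pi(y) - log_pi(x) + log_q(y, x) - log_q(x, y)
        if np.log(rng.uniform()) < log_alpha:
            x = y
        samples[t] = x
    return samples
```

On a standard Gaussian target, for example, `grad_log_pi` is `lambda x: -x` and `log_pi` is `lambda x: -0.5 * float(x @ x)`; the accept-reject step makes the chain exactly invariant for the target, unlike the unadjusted Langevin discretization.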
Pages: 1-79 (79 pages)
Related Papers
14 items
  • [1] Particle Metropolis-adjusted Langevin algorithms
    Nemeth, Christopher
    Sherlock, Chris
    Fearnhead, Paul
    [J]. BIOMETRIKA, 2016, 103 (03) : 701 - 717
  • [2] Nonconvex sampling with the Metropolis-adjusted Langevin algorithm
    Mangoubi, Oren
    Vishnoi, Nisheeth K.
    [J]. CONFERENCE ON LEARNING THEORY, VOL 99, 2019, 99
  • [3] GAUSSIAN APPROXIMATIONS OF SDES IN METROPOLIS-ADJUSTED LANGEVIN ALGORITHMS
    Sarkka, Simo
    Merkatas, Christos
    Karvonen, Toni
    [J]. 2021 IEEE 31ST INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2021,
  • [4] Langevin diffusions and the Metropolis-adjusted Langevin algorithm
    Xifara, T.
    Sherlock, C.
    Livingstone, S.
    Byrne, S.
    Girolami, M.
    [J]. STATISTICS & PROBABILITY LETTERS, 2014, 91 : 14 - 19
  • [5] Minimax Mixing Time of the Metropolis-Adjusted Langevin Algorithm for Log-Concave Sampling
    Wu, Keru
    Schmidler, Scott
    Chen, Yuansi
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23 : 1 - 63
  • [6] On geometric convergence for the Metropolis-adjusted Langevin algorithm under simple conditions
    Oliviero-Durmus, Alain
    Moulines, Eric
    [J]. BIOMETRIKA, 2024, 111 (01) : 273 - 289
  • [7] Decentralized Bayesian learning with Metropolis-adjusted Hamiltonian Monte Carlo
    Kungurtsev, Vyacheslav
    Cobb, Adam
    Javidi, Tara
    Jalaian, Brian
    [J]. MACHINE LEARNING, 2023, 112 (08) : 2791 - 2819
  • [8] A Shrinkage-Thresholding Metropolis Adjusted Langevin Algorithm for Bayesian Variable Selection
    Schreck, Amandine
    Fort, Gersende
    Le Corff, Sylvain
    Moulines, Eric
    [J]. IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2016, 10 (02) : 366 - 375