Variational Consensus Monte Carlo

Cited by: 0
Authors
Rabinovich, Maxim [1 ]
Angelino, Elaine [1 ]
Jordan, Michael I. [1 ]
Affiliations
[1] Univ Calif Berkeley, Div Comp Sci, Berkeley, CA 94720 USA
Keywords
DOI
(not available)
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Practitioners of Bayesian statistics have long depended on Markov chain Monte Carlo (MCMC) to obtain samples from intractable posterior distributions. Unfortunately, MCMC algorithms are typically serial, and do not scale to the large datasets typical of modern machine learning. The recently proposed consensus Monte Carlo algorithm removes this limitation by partitioning the data and drawing samples conditional on each partition in parallel [22]. A fixed aggregation function then combines these samples, yielding approximate posterior samples. We introduce variational consensus Monte Carlo (VCMC), a variational Bayes algorithm that optimizes over aggregation functions to obtain samples from a distribution that better approximates the target. The resulting objective contains an intractable entropy term; we therefore derive a relaxation of the objective and show that the relaxed problem is blockwise concave under mild conditions. We illustrate the advantages of our algorithm on three inference tasks from the literature, demonstrating both the superior quality of the posterior approximation and the moderate overhead of the optimization step. Our algorithm achieves a relative error reduction (measured against serial MCMC) of up to 39% compared to consensus Monte Carlo on the task of estimating 300-dimensional probit regression parameter expectations; similarly, it achieves an error reduction of 92% on the task of estimating cluster comembership probabilities in a Gaussian mixture model with 8 components in 8 dimensions. Furthermore, these gains come at moderate cost compared to the runtime of serial MCMC, achieving near-ideal speedup in some instances.
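The baseline scheme that VCMC generalizes can be illustrated concretely. The sketch below, assuming a toy conjugate Gaussian model (variable names and the specific weighting are illustrative, not taken from the paper), shows the fixed-aggregation consensus Monte Carlo pipeline the abstract describes: partition the data, draw samples from each partition's subposterior in parallel, then combine aligned draws with a fixed precision-weighted average. VCMC would instead optimize the aggregation function variationally.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: x_i ~ N(theta, 1) with prior theta ~ N(0, 10^2).
theta_true = 2.0
x = rng.normal(theta_true, 1.0, size=1000)

K = 4                          # number of data partitions (parallel workers)
shards = np.array_split(x, K)
S = 5000                       # samples drawn per partition

# Each worker targets the subposterior p_k(theta) proportional to
# prior^(1/K) * likelihood(shard_k). For this conjugate Gaussian
# model the subposterior is Gaussian in closed form, so "sampling"
# is exact; in general each worker would run its own MCMC chain.
prior_prec = 1.0 / 10.0**2
sub_samples, sub_precisions = [], []
for shard in shards:
    prec = prior_prec / K + len(shard)    # subposterior precision
    mean = shard.sum() / prec             # subposterior mean
    sub_samples.append(rng.normal(mean, prec**-0.5, size=S))
    sub_precisions.append(prec)

# Fixed aggregation: precision-weighted average of aligned draws.
W = np.array(sub_precisions)
combined = (W[:, None] * np.vstack(sub_samples)).sum(axis=0) / W.sum()

# Full-data posterior mean, for comparison.
full_mean = x.sum() / (prior_prec + len(x))
print(abs(combined.mean() - full_mean))   # small for this Gaussian model
```

For a Gaussian target these fixed weights are exactly right, which is why the combined mean closely matches the full-data posterior mean here; the paper's point is that for non-Gaussian posteriors a fixed aggregation function can be far from optimal, motivating the variational optimization over aggregation functions.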
Pages: 9
Related Papers (50 total)
  • [1] Consensus Variational and Monte Carlo Algorithms for Bayesian Nonparametric Clustering
    Ni, Yang
    Jones, David
    Wang, Zeya
    [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2020, : 204 - 209
  • [2] Variational Sequential Monte Carlo
    Naesseth, Christian A.
    Linderman, Scott W.
    Ranganath, Rajesh
    Blei, David M.
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 84, 2018, 84
  • [3] Variational Monte Carlo in solids
    Fahy, S
    [J]. QUANTUM MONTE CARLO METHODS IN PHYSICS AND CHEMISTRY, 1999, 525 : 101 - 127
  • [4] Streaming Variational Monte Carlo
    Zhao, Yuan
    Nassar, Josue
    Jordan, Ian
    Bugallo, Monica
    Park, Il Memming
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (01) : 1150 - 1161
  • [5] Variational Bayesian Monte Carlo
    Acerbi, Luigi
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [6] Global Consensus Monte Carlo
    Rendell, Lewis J.
    Johansen, Adam M.
    Lee, Anthony
    Whiteley, Nick
    [J]. JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2021, 30 (02) : 249 - 259
  • [7] Delayed rejection variational Monte Carlo
    Bressanini, D
    Morosi, G
    Tarasco, S
    Mira, A
    [J]. JOURNAL OF CHEMICAL PHYSICS, 2004, 121 (08): : 3446 - 3451
  • [8] Variational Bayes on Monte Carlo Steroids
    Grover, Aditya
    Ermon, Stefano
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [9] Variational Monte Carlo Treatment of Molecules
    Huang Hongxin
    Zhong Ziyi
    Cao Zexing
    [J]. ACTA PHYSICO-CHIMICA SINICA, 1997, 13 (08) : 706 - 711
  • [10] Variational Inference for Monte Carlo Objectives
    Mnih, Andriy
    Rezende, Danilo J.
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016, 48