Extended Variational Message Passing for Automated Approximate Bayesian Inference

Cited by: 5
Authors
Akbayrak, Semih [1]
Bocharov, Ivan [1]
de Vries, Bert [1,2]
Affiliations
[1] Eindhoven Univ Technol, Dept Elect Engn, POB 513, NL-5600 MB Eindhoven, Netherlands
[2] GN Hearing BV, JF Kennedylaan 2, NL-5612 AB Eindhoven, Netherlands
Keywords
Bayesian inference; variational inference; factor graphs; variational message passing; probabilistic programming; factor graph approach
DOI
10.3390/e23070815
Chinese Library Classification
O4 [Physics]
Discipline code
0702
Abstract
Variational Message Passing (VMP) provides an automatable and efficient algorithmic framework for approximate Bayesian inference in factorized probabilistic models composed of conjugate exponential-family distributions. Automating Bayesian inference is important because many data processing problems can be formulated as inference tasks on a generative probabilistic model. However, accurate generative models may also contain deterministic, possibly nonlinear variable mappings and non-conjugate factor pairs, which complicate the automatic execution of the VMP algorithm. In this paper, we show that executing VMP in such complex models hinges on the ability to compute the expectations of the statistics of hidden variables. We extend the applicability of VMP by approximating the required expectations, where appropriate, with importance sampling and the Laplace approximation. As a result, the proposed Extended VMP (EVMP) approach supports automated, efficient inference for a very wide range of probabilistic model specifications. We implemented EVMP in the Julia-language probabilistic programming package ForneyLab.jl and show through a number of examples that EVMP yields an almost universal inference engine for factorized probabilistic models.
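The abstract names two workhorse approximations for the expectations that VMP needs: importance sampling and the Laplace approximation. The toy Python sketch below (not the authors' Julia/ForneyLab.jl implementation; the target density, proposal, and sample count are illustrative assumptions) estimates the expected statistics E[z] and E[z²] of a scalar hidden variable both ways, using a Gaussian target whose moments are known in closed form so the estimates can be checked.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized log-density of a scalar hidden variable z. The quadratic
# form below means the exact posterior is N(2, 1), so E[z] = 2 and
# E[z^2] = 5 -- a deliberately simple target so both approximations can
# be compared against closed-form moments.
def log_p_tilde(z):
    return -0.5 * (z - 2.0) ** 2

# --- Importance sampling estimate of the expected statistics ---
# Proposal q(z) = N(0, 3^2), wide enough to cover the target.
n = 200_000
z = rng.normal(0.0, 3.0, size=n)
log_q = -0.5 * (z / 3.0) ** 2  # log N(z; 0, 9) up to a constant

# Self-normalized importance weights (additive constants cancel).
log_w = log_p_tilde(z) - log_q
w = np.exp(log_w - log_w.max())  # subtract max for numerical stability
w /= w.sum()

E_z = float(np.sum(w * z))        # ~ 2.0
E_z2 = float(np.sum(w * z ** 2))  # ~ 5.0

# --- Laplace approximation ---
# Locate the mode by Newton's method with finite-difference derivatives,
# then match a Gaussian to the curvature at the mode.
h = 1e-4
def d1(f, x):  # central first derivative
    return (f(x + h) - f(x - h)) / (2 * h)
def d2(f, x):  # central second derivative
    return (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2

z_map = 0.0
for _ in range(50):
    z_map -= d1(log_p_tilde, z_map) / d2(log_p_tilde, z_map)

mu_laplace = z_map                           # ~ 2.0 (posterior mode)
var_laplace = -1.0 / d2(log_p_tilde, z_map)  # ~ 1.0 (inverse curvature)
```

For this Gaussian target both routes recover the exact moments; the point of EVMP, per the abstract, is that such approximations can be invoked automatically only at the non-conjugate or nonlinear factors of a model while standard VMP messages are used everywhere else.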
Pages: 36