Extended Variational Message Passing for Automated Approximate Bayesian Inference

Cited by: 5
Authors:
Akbayrak, Semih [1]
Bocharov, Ivan [1]
de Vries, Bert [1,2]
Affiliations:
[1] Eindhoven Univ Technol, Dept Elect Engn, POB 513, NL-5600 MB Eindhoven, Netherlands
[2] GN Hearing BV, JF Kennedylaan 2, NL-5612 AB Eindhoven, Netherlands
Keywords:
Bayesian inference; variational inference; factor graphs; variational message passing; probabilistic programming; factor graph approach
DOI: 10.3390/e23070815
Chinese Library Classification: O4 [Physics]
Subject classification code: 0702
Abstract:
Variational Message Passing (VMP) provides an automatable and efficient algorithmic framework for approximating Bayesian inference in factorized probabilistic models that consist of conjugate exponential family distributions. The automation of Bayesian inference tasks is important because many data processing problems can be formulated as inference tasks on a generative probabilistic model. However, accurate generative models may also contain deterministic and possibly nonlinear variable mappings and non-conjugate factor pairs that complicate the automatic execution of the VMP algorithm. In this paper, we show that executing VMP in complex models relies on the ability to compute the expectations of the statistics of hidden variables. We extend the applicability of VMP by approximating the required expectation quantities in appropriate cases by importance sampling and Laplace approximation. As a result, the proposed Extended VMP (EVMP) approach supports automated, efficient inference for a very wide range of probabilistic model specifications. We implemented EVMP in the Julia language in the probabilistic programming package ForneyLab.jl and show by a number of examples that EVMP provides an almost universal inference engine for factorized probabilistic models.
Pages: 36