Variational Bayesian inference with Gaussian-mixture approximations

Cited by: 14
Authors
Zobay, O. [1]
Affiliations
[1] Univ Bristol, Dept Math, Bristol BS8 1TW, Avon, England
Source
ELECTRONIC JOURNAL OF STATISTICS
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK;
Keywords
Approximation methods; variational inference; normal mixtures; Bayesian lasso; state-space models;
DOI
10.1214/14-EJS887
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
Variational Bayesian inference with a Gaussian posterior approximation provides an alternative to the more commonly employed factorization approach and enlarges the range of tractable distributions. In this paper, we propose an extension to the Gaussian approach which uses Gaussian mixtures as approximations. A general problem for variational inference with mixtures is posed by the calculation of the entropy term in the Kullback-Leibler distance, which becomes analytically intractable. We deal with this problem by using a simple lower bound for the entropy and imposing restrictions on the form of the Gaussian covariance matrix. In this way, efficient numerical calculations become possible. To illustrate the method, we discuss its application to an isotropic generalized normal target density, a non-Gaussian state space model, and the Bayesian lasso. For heavy-tailed distributions, the examples show that the mixture approach indeed leads to improved approximations in the sense of a reduced Kullback-Leibler distance. From a more practical point of view, mixtures can improve estimates of posterior marginal variances. Furthermore, they provide an initial estimate of posterior skewness which is not possible with single Gaussians. We also discuss general sufficient conditions under which mixtures are guaranteed to provide improvements over single-component approximations.
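The intractable entropy term mentioned in the abstract is commonly handled with a Jensen-type lower bound, H(q) ≥ −Σᵢ wᵢ log Σⱼ wⱼ N(μᵢ; μⱼ, Σᵢ + Σⱼ), which is analytic because the convolution of two Gaussians is Gaussian. The sketch below (an illustration of this standard bound, not the paper's actual implementation; all function names are hypothetical) evaluates it for a one-dimensional two-component mixture and compares it with a Monte Carlo estimate of the true entropy:

```python
import numpy as np

def gauss_pdf(x, mu, var):
    # Density of N(mu, var) evaluated at x (supports broadcasting).
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def mixture_entropy_lower_bound(weights, means, variances):
    # Jensen-type lower bound on the entropy of a 1-D Gaussian mixture:
    # H(q) >= -sum_i w_i log sum_j w_j N(mu_i; mu_j, var_i + var_j).
    w, mu, var = map(np.asarray, (weights, means, variances))
    # z[i, j] = N(mu_i; mu_j, var_i + var_j), analytic via Gaussian convolution.
    z = gauss_pdf(mu[:, None], mu[None, :], var[:, None] + var[None, :])
    return -np.sum(w * np.log(z @ w))

# Example: equal-weight mixture of N(-2, 1) and N(2, 1).
w = np.array([0.5, 0.5])
mu = np.array([-2.0, 2.0])
var = np.array([1.0, 1.0])

bound = mixture_entropy_lower_bound(w, mu, var)

# Monte Carlo estimate of the true entropy, for comparison.
rng = np.random.default_rng(0)
comp = rng.choice(2, size=200_000, p=w)
x = rng.normal(mu[comp], np.sqrt(var[comp]))
qx = w[0] * gauss_pdf(x, mu[0], var[0]) + w[1] * gauss_pdf(x, mu[1], var[1])
mc_entropy = -np.mean(np.log(qx))
```

For this well-separated mixture the bound (≈ 1.94 nats) sits below the true entropy (≈ 2.11 nats); substituting such a bound into the KL objective keeps the variational optimization tractable at the cost of a looser objective.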
Pages: 355 - 389
Page count: 35
Related Papers
50 records in total
  • [1] Community Embeddings with Bayesian Gaussian Mixture Model and Variational Inference
    Begehr, Anton I. N.
    Panfilov, Peter B.
    [J]. 2022 IEEE 24TH CONFERENCE ON BUSINESS INFORMATICS (CBI 2022), VOL 2, 2022, : 88 - 96
  • [2] Uncertainty propagation using infinite mixture of Gaussian processes and variational Bayesian inference
    Chen, Peng
    Zabaras, Nicholas
    Bilionis, Ilias
    [J]. JOURNAL OF COMPUTATIONAL PHYSICS, 2015, 284 : 291 - 333
  • [3] Design of Bayesian Signal Detectors using Gaussian-Mixture Models
    Jilkov, Vesselin P.
    Katkuri, Jaipal R.
    Nandiraju, Hari K.
    [J]. 2010 42ND SOUTHEASTERN SYMPOSIUM ON SYSTEM THEORY (SSST), 2010,
  • [4] An Introduction to Bayesian Inference via Variational Approximations
    Grimmer, Justin
    [J]. POLITICAL ANALYSIS, 2011, 19 (01) : 32 - 47
  • [5] Sharp Guarantees and Optimal Performance for Inference in Binary and Gaussian-Mixture Models
    Taheri, Hossein
    Pedarsani, Ramtin
    Thrampoulidis, Christos
    [J]. ENTROPY, 2021, 23 (02) : 1 - 46
  • [6] Gaussian-Mixture Umbrella Sampling
    Maragakis, Paul
    van der Vaart, Arjan
    Karplus, Martin
    [J]. JOURNAL OF PHYSICAL CHEMISTRY B, 2009, 113 (14): 4664 - 4673
  • [7] Channel Estimation for Massive MIMO Using Gaussian-Mixture Bayesian Learning
    Wen, Chao-Kai
    Jin, Shi
    Wong, Kai-Kit
    Chen, Jung-Chieh
    Ting, Pangan
    [J]. IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2015, 14 (03) : 1356 - 1368
  • [9] Variational Inference of Finite Asymmetric Gaussian Mixture Models
    Song, Ziyang
    Bregu, Ornela
    Ali, Samr
    Bouguila, Nizar
    [J]. 2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019), 2019, : 2448 - 2454
  • [10] A Gaussian-mixture ensemble transform filter
    Reich, Sebastian
    [J]. QUARTERLY JOURNAL OF THE ROYAL METEOROLOGICAL SOCIETY, 2012, 138 (662) : 222 - 233