Variational Bayesian inference with Gaussian-mixture approximations

Cited by: 14
Author
Zobay, O. [1]
Affiliation
[1] Univ Bristol, Dept Math, Bristol BS8 1TW, Avon, England
Funding
UK Engineering and Physical Sciences Research Council;
Keywords
Approximation methods; variational inference; normal mixtures; Bayesian lasso; state-space models;
DOI
10.1214/14-EJS887
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Codes
020208; 070103; 0714
Abstract
Variational Bayesian inference with a Gaussian posterior approximation provides an alternative to the more commonly employed factorization approach and enlarges the range of tractable distributions. In this paper, we propose an extension to the Gaussian approach which uses Gaussian mixtures as approximations. A general problem for variational inference with mixtures is posed by the calculation of the entropy term in the Kullback-Leibler distance, which becomes analytically intractable. We deal with this problem by using a simple lower bound for the entropy and imposing restrictions on the form of the Gaussian covariance matrix. In this way, efficient numerical calculations become possible. To illustrate the method, we discuss its application to an isotropic generalized normal target density, a non-Gaussian state space model, and the Bayesian lasso. For heavy-tailed distributions, the examples show that the mixture approach indeed leads to improved approximations in the sense of a reduced Kullback-Leibler distance. From a more practical point of view, mixtures can improve estimates of posterior marginal variances. Furthermore, they provide an initial estimate of posterior skewness which is not possible with single Gaussians. We also discuss general sufficient conditions under which mixtures are guaranteed to provide improvements over single-component approximations.
Pages: 355-389
Page count: 35
Related Papers
50 records in total
  • [21] GPGPU Implementation of Variational Bayesian Gaussian Mixture Models
    Nishimoto, Hiroki
    Nakada, Takashi
    Nakashima, Yasuhiko
    [J]. 2019 SEVENTH INTERNATIONAL SYMPOSIUM ON COMPUTING AND NETWORKING (CANDAR 2019), 2019, : 185 - 190
  • [22] Variational Bayesian feature selection for Gaussian mixture models
    Valente, F
    Wellekens, C
    [J]. 2004 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOL I, PROCEEDINGS: SPEECH PROCESSING, 2004, : 513 - 516
  • [23] Variational approximations in Bayesian model selection for finite mixture distributions
    McGrory, C. A.
    Titterington, D. M.
    [J]. COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2007, 51 (11) : 5352 - 5367
  • [24] Trust-region variational inference with Gaussian mixture models
    Arenz, Oleg
    Zhong, Mingjun
    Neumann, Gerhard
    [J]. Journal of Machine Learning Research, 2020, 21
  • [25] Variational Learning and Inference Algorithms for Extended Gaussian Mixture Model
    Wei, Xin
    Chen, Jianxin
    Wang, Lei
    Cui, Jingwu
    Zheng, Baoyu
    [J]. 2014 IEEE/CIC INTERNATIONAL CONFERENCE ON COMMUNICATIONS IN CHINA (ICCC), 2014, : 236 - 240
  • [27] Stochastic Variational Inference for Bayesian Sparse Gaussian Process Regression
    Yu, Haibin
    Trong Nghia Hoang
    Low, Bryan Kian Hsiang
    Jaillet, Patrick
    [J]. 2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [28] Gaussian-Mixture based Ensemble Kalman Filter
    Govaers, Felix
    Koch, Wolfgang
    Willett, Peter
    [J]. 2015 18TH INTERNATIONAL CONFERENCE ON INFORMATION FUSION (FUSION), 2015, : 1625 - 1632
  • [29] Bayesian inference for infinite asymmetric Gaussian mixture with feature selection
    Song, Ziyang
    Ali, Samr
    Bouguila, Nizar
    [J]. Soft Computing, 2021, 25 : 6043 - 6053