Scalable importance sampling estimation of Gaussian mixture posteriors in Bayesian networks

Cited by: 8
Authors
Ramos-Lopez, Dario [1]
Masegosa, Andres R. [1]
Salmeron, Antonio [1]
Rumi, Rafael [1]
Langseth, Helge [3]
Nielsen, Thomas D. [1,2]
Madsen, Anders L. [2,4]
Affiliations
[1] Univ Almeria, Dept Math, Almeria, Spain
[2] Aalborg Univ, Dept Comp Sci, Aalborg, Denmark
[3] Norwegian Univ Sci & Technol, Dept Comp & Informat Sci, Trondheim, Norway
[4] HUGIN EXPERT AS, Aalborg, Denmark
Keywords
Importance sampling; Bayesian networks; Conditional linear Gaussian models; Scalable inference; Gaussian mixtures; inference; propagation; computation; algorithm
DOI
10.1016/j.ijar.2018.06.004
CLC classification
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In this paper we propose a scalable importance sampling algorithm for computing Gaussian mixture posteriors in conditional linear Gaussian Bayesian networks. Our contribution relies on a stochastic gradient ascent procedure that takes as input a stream of importance sampling weights, so that a mixture of Gaussians is dynamically updated with no need to store the full sample. The algorithm has been designed following a Map/Reduce approach and is therefore scalable with respect to computing resources. The implementation of the proposed algorithm is available as part of the AMIDST open-source toolbox for scalable probabilistic machine learning. (C) 2018 Elsevier Inc. All rights reserved.
Pages: 115-134 (20 pages)