Scalable importance sampling estimation of Gaussian mixture posteriors in Bayesian networks

Cited by: 8
Authors
Ramos-Lopez, Dario [1]
Masegosa, Andres R. [1]
Salmeron, Antonio [1]
Rumi, Rafael [1]
Langseth, Helge [3]
Nielsen, Thomas D. [1, 2]
Madsen, Anders L. [2, 4]
Affiliations
[1] Univ Almeria, Dept Math, Almeria, Spain
[2] Aalborg Univ, Dept Comp Sci, Aalborg, Denmark
[3] Norwegian Univ Sci & Technol, Dept Comp & Informat Sci, Trondheim, Norway
[4] HUGIN EXPERT AS, Aalborg, Denmark
Keywords
Importance sampling; Bayesian networks; Conditional linear Gaussian models; Scalable inference; Gaussian mixtures; INFERENCE; PROPAGATION; COMPUTATION; ALGORITHM
DOI
10.1016/j.ijar.2018.06.004
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper we propose a scalable importance sampling algorithm for computing Gaussian mixture posteriors in conditional linear Gaussian Bayesian networks. Our contribution is based on a stochastic gradient ascent procedure that takes as input a stream of importance sampling weights, so that a mixture of Gaussians is dynamically updated with no need to store the full sample. The algorithm has been designed following a Map/Reduce approach and is therefore scalable with respect to computing resources. The implementation of the proposed algorithm is available as part of the AMIDST open-source toolbox for scalable probabilistic machine learning. (C) 2018 Elsevier Inc. All rights reserved.
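The abstract describes the core idea: importance samples and their weights arrive as a stream, and the parameters of a Gaussian mixture approximating the posterior are updated by stochastic gradient ascent on the weighted log-likelihood, so the full sample never has to be stored. The following is a minimal, self-contained sketch of that general idea only, not the authors' AMIDST implementation; the univariate setting, the class and function names, the toy target and proposal densities, and the fixed learning rate are all illustrative assumptions.

```python
# Sketch (assumed, not the paper's code): streaming stochastic gradient
# ascent on weight * log q(x), where q is a univariate Gaussian mixture.
import numpy as np

def gaussian_pdf(x, mu, sigma):
    # Univariate normal density; broadcasts over component arrays.
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

class StreamingGaussianMixture:
    """Gaussian mixture fitted online from (sample, importance weight) pairs."""

    def __init__(self, n_components, lr=0.01, rng=None):
        rng = rng if rng is not None else np.random.default_rng(0)
        self.logits = np.zeros(n_components)           # unconstrained mixture weights
        self.mu = rng.normal(0.0, 1.0, n_components)   # component means
        self.log_sigma = np.zeros(n_components)        # log standard deviations
        self.lr = lr

    def params(self):
        pi = np.exp(self.logits - self.logits.max())
        pi /= pi.sum()
        return pi, self.mu, np.exp(self.log_sigma)

    def update(self, x, weight):
        # One gradient ascent step on weight * log q(x),
        # where q(x) = sum_k pi_k N(x; mu_k, sigma_k^2).
        pi, mu, sigma = self.params()
        comp = pi * gaussian_pdf(x, mu, sigma)
        q = comp.sum() + 1e-300
        r = comp / q                                   # responsibilities
        self.logits += self.lr * weight * (r - pi)
        self.mu += self.lr * weight * r * (x - mu) / sigma ** 2
        self.log_sigma += self.lr * weight * r * (((x - mu) / sigma) ** 2 - 1.0)

    def pdf(self, x):
        pi, mu, sigma = self.params()
        return float(np.sum(pi * gaussian_pdf(x, mu, sigma)))

if __name__ == "__main__":
    # Toy demo: approximate a bimodal target with importance samples
    # drawn from a single broad Gaussian proposal N(0, 3^2).
    rng = np.random.default_rng(1)

    def target(x):
        return 0.5 * gaussian_pdf(x, -2.0, 0.5) + 0.5 * gaussian_pdf(x, 2.0, 0.5)

    gm = StreamingGaussianMixture(n_components=2, lr=0.02, rng=rng)
    for _ in range(50_000):
        x = rng.normal(0.0, 3.0)                       # sample from the proposal
        w = target(x) / gaussian_pdf(x, 0.0, 3.0)      # importance weight
        gm.update(x, w)                                # stream update; sample not stored
    print(gm.params())
```

In the paper's setting the samples and weights would come from evidence propagation in a conditional linear Gaussian network and the updates would be distributed following a Map/Reduce scheme; here a fixed toy target merely stands in for the posterior so the sketch stays runnable, and its convergence depends on the learning rate and initialization.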
Pages: 115 - 134
Page count: 20
Related Papers
50 records in total
  • [31] Gaussian Mixture Models for Affordance Learning using Bayesian Networks
    Osorio, Pedro
    Bernardino, Alexandre
    Martinez-Cantin, Ruben
    Santos-Victor, Jose
    [J]. IEEE/RSJ 2010 INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2010), 2010
  • [32] Density estimation with Gaussian processes for gravitational wave posteriors
    D'Emilio, V.
    Green, R.
    Raymond, V.
    [J]. MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY, 2021, 508 (02) : 2090 - 2097
  • [33] Sworn testimony of the model evidence: Gaussian Mixture Importance (GAME) sampling
    Volpi, Elena
    Schoups, Gerrit
    Firmani, Giovanni
    Vrugt, Jasper A.
    [J]. WATER RESOURCES RESEARCH, 2017, 53 (07) : 6133 - 6158
  • [34] Concave Penalized Estimation of Sparse Gaussian Bayesian Networks
    Aragam, Bryon
    Zhou, Qing
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2015, 16 : 2273 - 2328
  • [35] Implicit parameter estimation for conditional Gaussian Bayesian networks
    Jarraya, Aida
    Leray, Philippe
    Masmoudi, Afif
    [J]. INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE SYSTEMS, 2014, 7 : 6 - 17
  • [38] Parameter Learning for Hybrid Bayesian Networks With Gaussian Mixture and Dirac Mixture Conditional Densities
    Krauthausen, Peter
    Hanebeck, Uwe D.
    [J]. 2010 AMERICAN CONTROL CONFERENCE, 2010, : 480 - 485
  • [39] Parameter Estimation of Gaussian Mixture Model Based on Variational Bayesian Learning
    Zhao, Linchang
    Shang, Zhaowei
    Qin, Anyong
    Tang, Yuan Yan
    [J]. PROCEEDINGS OF 2018 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS (ICMLC), VOL 1, 2018 : 99 - 104
  • [40] DYNESTY: a dynamic nested sampling package for estimating Bayesian posteriors and evidences
    Speagle, Joshua S.
    [J]. MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY, 2020, 493 (03) : 3132 - 3158