Particle filter with iterative importance sampling for Bayesian networks inference

Cited by: 1
Authors
Chang, KC [1]
He, DH [1]
Affiliation
[1] George Mason Univ, Sch Informat Technol & Engn, Dept Syst Engn & Operat Res, Fairfax, VA 22030 USA
Keywords
Bayesian networks inference; importance sampling; particle filter;
DOI
10.1117/12.606063
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Bayesian networks have been applied widely in many areas, such as multi-sensor fusion, situation assessment, and decision making under uncertainty. It is well known that, in general, exact probabilistic inference in large, complex networks is computationally difficult or intractable. To deal with this difficulty, "anytime" stochastic simulation methods such as likelihood weighting and importance sampling have become popular. In this paper, we introduce a very efficient iterative importance sampling algorithm for Bayesian network inference. Much like the recently popular sequential simulation method, the particle filter, this algorithm identifies an importance function and conducts sampling iteratively. However, particle filter methods often run into the so-called "degeneracy" or "impoverishment" problems due to unlikely evidence or a high-dimensional sampling space. To overcome this, the Bayesian network particle filter (BNPF) algorithm decomposes the global state space into local ones based on the network structure and learns the importance function accordingly in an iterative manner. We used large real-world Bayesian network models available in the academic community to test the inference method. The preliminary simulation results show that the algorithm is very promising.
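The paper's BNPF algorithm itself is not reproduced in this record. As a minimal illustration of the likelihood-weighting baseline the abstract mentions (sample non-evidence nodes from their priors, then weight each sample by the likelihood of the observed evidence), here is a sketch on a hypothetical two-node network; the network, its probabilities, and all names below are illustrative assumptions, not taken from the paper.

```python
import random

# Hypothetical toy network (not from the paper): Rain -> WetGrass.
# P(Rain=1) = 0.2; P(WetGrass=1 | Rain=1) = 0.9; P(WetGrass=1 | Rain=0) = 0.1.
P_RAIN = 0.2
P_WET_GIVEN_RAIN = {1: 0.9, 0: 0.1}

def likelihood_weighting(n_samples, evidence_wet=1, seed=0):
    """Estimate P(Rain=1 | WetGrass=evidence_wet) by likelihood weighting:
    sample the non-evidence node from its prior and weight each sample by
    the likelihood of the fixed evidence given its sampled parent."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n_samples):
        rain = 1 if rng.random() < P_RAIN else 0          # sample from prior
        p_wet = P_WET_GIVEN_RAIN[rain]
        w = p_wet if evidence_wet == 1 else 1.0 - p_wet   # evidence likelihood
        num += w * rain
        den += w
    return num / den

# Exact posterior for comparison:
# P(Rain=1 | Wet=1) = 0.2*0.9 / (0.2*0.9 + 0.8*0.1) = 0.18/0.26 ≈ 0.692
estimate = likelihood_weighting(50_000)
```

On a network this small the estimate converges quickly; the degeneracy problem the abstract describes arises when the evidence is very unlikely under the prior, so that almost all sample weights are near zero, which is what motivates learning a better importance function iteratively.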
Pages: 313-321 (9 pages)
Related Papers (50 records total)
  • [31] Efficient Bayes inference in neural networks through adaptive importance sampling
    Huang, Yunshi
    Chouzenoux, Emilie
    Elvira, Victor
    Pesquet, Jean-Christophe
    JOURNAL OF THE FRANKLIN INSTITUTE-ENGINEERING AND APPLIED MATHEMATICS, 2023, 360 (16): : 12125 - 12149
  • [32] A Novel Deterministic Mixture Particle Filter Based on Multiple Importance Sampling
    Zheng, Linyao
    Li, Zhi
    Yang, Yanbo
    2020 CHINESE AUTOMATION CONGRESS (CAC 2020), 2020, : 3579 - 3584
  • [33] Markov Chain Monte Carlo versus Importance Sampling in Bayesian Inference of the GARCH model
    Takaishi, Tetsuya
    17TH INTERNATIONAL CONFERENCE IN KNOWLEDGE BASED AND INTELLIGENT INFORMATION AND ENGINEERING SYSTEMS - KES2013, 2013, 22 : 1056 - 1064
  • [34] A PARTICLE GIBBS SAMPLING APPROACH TO TOPOLOGY INFERENCE IN GENE REGULATORY NETWORKS
    Iloska, Marija
    El-Laham, Yousef
    Bugallo, Monica F.
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 5855 - 5859
  • [36] Scalable importance sampling estimation of Gaussian mixture posteriors in Bayesian networks
    Ramos-Lopez, Dario
    Masegosa, Andres R.
    Salmeron, Antonio
    Rumi, Rafael
    Langseth, Helge
    Nielsen, Thomas D.
    Madsen, Anders L.
    INTERNATIONAL JOURNAL OF APPROXIMATE REASONING, 2018, 100 : 115 - 134
  • [37] Sampling-Free Variational Inference of Bayesian Neural Networks by Variance Backpropagation
    Haussmann, Manuel
    Hamprecht, Fred A.
    Kandemir, Melih
    35TH UNCERTAINTY IN ARTIFICIAL INTELLIGENCE CONFERENCE (UAI 2019), 2020, 115 : 563 - 573
  • [38] Bayesian Nonparametric Weighted Sampling Inference
    Si, Yajuan
    Pillai, Natesh S.
    Gelman, Andrew
    BAYESIAN ANALYSIS, 2015, 10 (03): : 605 - 625
  • [39] Bayesian iterative binary filter design
    Kamat, VG
    Dougherty, ER
    NONLINEAR IMAGE PROCESSING AND PATTERN ANALYSIS XII, 2001, 4304 : 197 - 208
  • [40] Implementation of the Auxiliary Sampling Importance Resampling Particle Filter on Graphics Processing Unit
    Dulger, Ozcan
    Oguztuzun, Halit
    2020 5TH INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE AND ENGINEERING (UBMK), 2020, : 156 - 159