Entropy and the Kullback-Leibler Divergence for Bayesian Networks: Computational Complexity and Efficient Implementation

Cited by: 2
Authors
Scutari, Marco [1]
Affiliations
[1] Istituto Dalle Molle di Studi sull'Intelligenza Artificiale (IDSIA), CH-6900 Lugano, Switzerland
Keywords
Bayesian networks; Shannon entropy; Kullback-Leibler divergence; propagation; inference
DOI
10.3390/a17010024
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Bayesian networks (BNs) are a foundational model in machine learning and causal inference. Their graphical structure makes it possible to handle high-dimensional problems by dividing them into a sparse collection of smaller ones; it also underlies Judea Pearl's theory of causality and determines the explainability and interpretability of BNs. Despite their popularity, the literature offers almost no guidance on how to compute Shannon's entropy and the Kullback-Leibler (KL) divergence for BNs under their most common distributional assumptions. In this paper, we provide computationally efficient algorithms for both by leveraging the graphical structure of BNs, and we illustrate them with a complete set of numerical examples. In the process, we show that the computational complexity of the KL divergence can be reduced from cubic to quadratic for Gaussian BNs.
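The abstract's central point, that entropy and KL computations should exploit the graphical structure rather than the full joint distribution, can be made concrete with a short sketch. The snippet below is illustrative, not the paper's implementation: it assumes linear Gaussian local distributions, and all function and variable names (gaussian_bn_entropy, gaussian_kl, residual_sd) are hypothetical. It contrasts the linear-cost entropy obtained from per-node residual variances with the textbook dense-matrix KL formula, whose matrix inversion is the cubic-cost baseline the paper improves on.

```python
import math

import numpy as np


def gaussian_bn_entropy(residual_sd):
    """Shannon entropy (in nats) of a linear Gaussian BN.

    The joint entropy decomposes along the DAG:
        H(X) = sum_i H(X_i | Pa(X_i)) = sum_i 0.5 * log(2 * pi * e * sigma_i^2),
    so only each node's residual standard deviation sigma_i is needed,
    never the joint covariance matrix.
    """
    return sum(0.5 * math.log(2.0 * math.pi * math.e * sd ** 2)
               for sd in residual_sd.values())


def gaussian_kl(mu_p, cov_p, mu_q, cov_q):
    """Textbook KL(p || q) between two d-dimensional Gaussians.

    The matrix inverse and determinants make this the O(d^3) baseline;
    the paper's quadratic algorithm avoids this dense linear algebra.
    """
    d = len(mu_p)
    cov_q_inv = np.linalg.inv(cov_q)
    diff = np.asarray(mu_q) - np.asarray(mu_p)
    _, logdet_p = np.linalg.slogdet(cov_p)
    _, logdet_q = np.linalg.slogdet(cov_q)
    return 0.5 * (np.trace(cov_q_inv @ cov_p) + diff @ cov_q_inv @ diff
                  - d + logdet_q - logdet_p)


# Example: a chain A -> B -> C with unit residual variances, and a KL
# divergence between two bivariate Gaussians.
print(gaussian_bn_entropy({"A": 1.0, "B": 1.0, "C": 1.0}))   # ~4.2568 nats
print(gaussian_kl([0.0, 0.0], np.eye(2), [1.0, 0.0], 2 * np.eye(2)))
```

Note that the entropy computed this way never forms the joint covariance; the paper's quadratic KL algorithm likewise exploits the sparsity of the BN's local regression coefficients, which this dense sketch does not attempt to reproduce.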
Pages: 32
Related Papers: 50 in total (items 41-50 shown)
  • [41] MINIMIZATION OF THE KULLBACK-LEIBLER DIVERGENCE FOR NONLINEAR ESTIMATION
    Darling, Jacob E.
    DeMars, Kyle J.
    ASTRODYNAMICS 2015, 2016, 156 : 213 - 232
  • [42] COMPLEX NMF WITH THE GENERALIZED KULLBACK-LEIBLER DIVERGENCE
    Kameoka, Hirokazu
    Kagami, Hideaki
    Yukawa, Masahiro
    2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017 : 56 - 60
  • [43] The generalized Kullback-Leibler divergence and robust inference
    Park, C
    Basu, A
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2003, 73 (05) : 311 - 332
  • [44] A Kullback-Leibler Divergence Variant of the Bayesian Cramér-Rao Bound
    Fauss, Michael
    Dytso, Alex
    Poor, H. Vincent
    SIGNAL PROCESSING, 2023, 207
  • [45] Bayesian case influence analysis for GARCH models based on Kullback-Leibler divergence
    Hao, Hong-Xia
    Lin, Jin-Guan
    Wang, Hong-Xia
    Huang, Xing-Fang
    JOURNAL OF THE KOREAN STATISTICAL SOCIETY, 2016, 45 (04) : 595 - 609
  • [46] On calibration of Kullback-Leibler divergence via prediction
    Keyes, TK
    Levy, MS
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 1999, 28 (01) : 67 - 85
  • [47] Estimation of Kullback-Leibler divergence by local likelihood
    Lee, Young Kyung
    Park, Byeong U.
    ANNALS OF THE INSTITUTE OF STATISTICAL MATHEMATICS, 2006, 58 (02) : 327 - 340
  • [48] The AIC criterion and symmetrizing the Kullback-Leibler divergence
    Seghouane, Abd-Krim
    Amari, Shun-Ichi
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2007, 18 (01) : 97 - 106
  • [49] On Bayesian selection of the best normal population using the Kullback-Leibler divergence measure
    Thabane, L
    Haq, MS
    STATISTICA NEERLANDICA, 1999, 53 (03) : 342 - 360
  • [50] Entropy production and Kullback-Leibler divergence between stationary trajectories of discrete systems
    Roldan, Edgar
    Parrondo, Juan M. R.
    PHYSICAL REVIEW E, 2012, 85 (03)