Entropy and the Kullback-Leibler Divergence for Bayesian Networks: Computational Complexity and Efficient Implementation

Cited by: 2
Authors
Scutari, Marco [1 ]
Affiliation
[1] Istituto Dalle Molle di Studi sull'Intelligenza Artificiale (IDSIA), CH-6900 Lugano, Switzerland
Keywords
Bayesian networks; Shannon entropy; Kullback-Leibler divergence; PROPAGATION; INFERENCE;
DOI
10.3390/a17010024
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Bayesian networks (BNs) are a foundational model in machine learning and causal inference. Their graphical structure makes high-dimensional problems tractable by dividing them into a sparse collection of smaller ones, underlies Judea Pearl's theory of causality, and determines their explainability and interpretability. Despite their popularity, there are almost no resources in the literature on how to compute Shannon's entropy and the Kullback-Leibler (KL) divergence for BNs under their most common distributional assumptions. In this paper, we provide computationally efficient algorithms for both by leveraging BNs' graphical structure, and we illustrate them with a complete set of numerical examples. In the process, we show that it is possible to reduce the computational complexity of the KL divergence from cubic to quadratic for Gaussian BNs.
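The structure-based computation the abstract alludes to rests on a standard identity: by the chain rule, the joint entropy of a BN decomposes into one local conditional-entropy term per node, H(X) = Σ_i H(X_i | Pa(X_i)), so the full joint never needs to be enumerated. A minimal sketch of this idea for a hypothetical two-node discrete network X → Y (the variable names and probability tables are illustrative, not taken from the paper):

```python
import math

# Hypothetical two-node discrete BN, X -> Y, with binary variables.
p_x = {0: 0.7, 1: 0.3}                        # P(X)
p_y_given_x = {0: {0: 0.8, 1: 0.2},           # P(Y | X = 0)
               1: {0: 0.3, 1: 0.7}}           # P(Y | X = 1)

def h(dist):
    """Shannon entropy (in nats) of a dict mapping outcomes to probabilities."""
    return -sum(p * math.log(p) for p in dist.values() if p > 0)

# Brute force: build the full joint P(X, Y) and take its entropy.
joint = {(x, y): p_x[x] * p_y_given_x[x][y]
         for x in p_x for y in (0, 1)}
h_joint = h(joint)

# Structure-aware: H(X, Y) = H(X) + sum_x P(X = x) * H(Y | X = x),
# one local term per node, no joint table required.
h_decomp = h(p_x) + sum(p_x[x] * h(p_y_given_x[x]) for x in p_x)

print(abs(h_joint - h_decomp) < 1e-12)  # both routes give the same entropy
```

For a sparse network the decomposition touches only each node's (small) conditional probability table, whereas the joint table grows exponentially in the number of variables; this is the kind of saving the paper's algorithms exploit systematically.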
Pages: 32