Entropy and the Kullback-Leibler Divergence for Bayesian Networks: Computational Complexity and Efficient Implementation

Cited by: 2
Author
Scutari, Marco [1 ]
Affiliation
[1] Istituto Dalle Molle di Studi sull'Intelligenza Artificiale (IDSIA), CH-6900 Lugano, Switzerland
Keywords
Bayesian networks; Shannon entropy; Kullback-Leibler divergence; propagation; inference
DOI
10.3390/a17010024
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Bayesian networks (BNs) are a foundational model in machine learning and causal inference. Their graphical structure makes high-dimensional problems tractable by dividing them into a sparse collection of smaller ones; it underlies Judea Pearl's formulation of causality, and it determines their explainability and interpretability. Despite their popularity, there are almost no resources in the literature on how to compute Shannon's entropy and the Kullback-Leibler (KL) divergence for BNs under their most common distributional assumptions. In this paper, we provide computationally efficient algorithms for both by leveraging the graphical structure of BNs, and we illustrate them with a complete set of numerical examples. In the process, we show that the computational complexity of the KL divergence can be reduced from cubic to quadratic in the number of variables for Gaussian BNs.
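As a concrete illustration of the two quantities the abstract refers to, below is a minimal Python sketch. It is not the paper's algorithms: the function names, the CPT layout, and the dense-matrix Gaussian KL are assumptions introduced here for illustration. The first function computes the Shannon entropy of a discrete BN through the chain-rule factorisation H(X) = sum_i H(X_i | Pa_i), which is what lets the graphical structure localise the computation to each node's conditional probability table; the second is the textbook closed-form KL divergence between two multivariate Gaussians, whose dense inverse and log-determinants are the O(d^3) baseline that the paper reduces to O(d^2) by exploiting graph sparsity.

    # Illustrative sketch only, NOT the paper's algorithms.
    import numpy as np

    def discrete_bn_entropy(cpts, parent_marginals):
        """H(X) = sum_i sum_pa P(Pa_i = pa) * H(X_i | Pa_i = pa).

        cpts[i][pa, x] = P(X_i = x | Pa_i = pa); parent_marginals[i][pa] =
        P(Pa_i = pa), assumed already available (e.g. from exact inference).
        """
        h = 0.0
        for cpt, pa_prob in zip(cpts, parent_marginals):
            # Entropy of each CPT row; the clip avoids log(0), and the
            # corresponding 0 * log(...) terms vanish as they should.
            row_h = -np.sum(cpt * np.log(np.clip(cpt, 1e-300, None)), axis=1)
            h += float(pa_prob @ row_h)
        return h

    def gaussian_kl(mu_p, sigma_p, mu_q, sigma_q):
        """KL(P || Q) for d-dimensional Gaussians: the dense inverse,
        trace and log-determinants make this O(d^3) as written."""
        d = len(mu_p)
        sigma_q_inv = np.linalg.inv(sigma_q)
        diff = mu_q - mu_p
        _, logdet_p = np.linalg.slogdet(sigma_p)
        _, logdet_q = np.linalg.slogdet(sigma_q)
        return 0.5 * (np.trace(sigma_q_inv @ sigma_p)
                      + diff @ sigma_q_inv @ diff
                      - d + logdet_q - logdet_p)

    # Example: a two-node BN A -> B with binary variables.
    p_a = np.array([[0.3, 0.7]])          # P(A); one "empty parent" config
    p_b_given_a = np.array([[0.9, 0.1],   # P(B | A = 0)
                            [0.2, 0.8]])  # P(B | A = 1)
    h = discrete_bn_entropy(
        cpts=[p_a, p_b_given_a],
        parent_marginals=[np.array([1.0]), np.array([0.3, 0.7])],
    )

    # Example: KL between two bivariate Gaussians.
    kl = gaussian_kl(np.zeros(2), np.eye(2),
                     np.array([1.0, 0.0]), np.array([[2.0, 0.3], [0.3, 1.0]]))

For the worked two-node example, H = H(A) + H(B | A) ≈ 0.611 + 0.448 ≈ 1.059 nats. The step glossed over in this sketch is obtaining the parent-configuration marginals, which in general requires exact inference over the BN; that, and avoiding the dense Gaussian matrix algebra, is where the paper's structure-exploiting algorithms come in.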
Pages: 32
Related Papers
50 records in total
  • [21] Efficient distributional reinforcement learning with Kullback-Leibler divergence regularization
    Li, Renxing
    Shang, Zhiwei
    Zheng, Chunhua
    Li, Huiyun
    Liang, Qing
    Cui, Yunduan
    APPLIED INTELLIGENCE, 2023, 53: 24847-24863
  • [22] AN INVOLUTION INEQUALITY FOR THE KULLBACK-LEIBLER DIVERGENCE
    Pinelis, Iosif
    MATHEMATICAL INEQUALITIES & APPLICATIONS, 2017, 20 (01): 233-235
  • [23] Distributions of the Kullback-Leibler divergence with applications
    Belov, Dmitry I.
    Armstrong, Ronald D.
    BRITISH JOURNAL OF MATHEMATICAL & STATISTICAL PSYCHOLOGY, 2011, 64 (02): 291-309
  • [24] Model Fusion with Kullback-Leibler Divergence
    Claici, Sebastian
    Yurochkin, Mikhail
    Ghosh, Soumya
    Solomon, Justin
    25TH AMERICAS CONFERENCE ON INFORMATION SYSTEMS (AMCIS 2019), 2019
  • [25] ON INFORMATION GAIN, KULLBACK-LEIBLER DIVERGENCE, ENTROPY PRODUCTION AND THE INVOLUTION KERNEL
    Lopes, Artur O.
    Mengue, Jairo K.
    DISCRETE AND CONTINUOUS DYNAMICAL SYSTEMS, 2022, 42 (07): 3593-3627
  • [26] HIF detection in distribution networks based on Kullback-Leibler divergence
    Nezamzadeh-Ejieh, Shiva
    Sadeghkhani, Iman
    IET GENERATION TRANSMISSION & DISTRIBUTION, 2020, 14 (01): 29-36
  • [27] Some Order Preserving Inequalities for Cross Entropy and Kullback-Leibler Divergence
    Sbert, Mateu
    Chen, Min
    Poch, Jordi
    Bardera, Anton
    ENTROPY, 2018, 20 (12)
  • [28] Rényi Relative Entropy from Homogeneous Kullback-Leibler Divergence Lagrangian
    Chirco, Goffredo
    GEOMETRIC SCIENCE OF INFORMATION (GSI 2021), 2021, 12829: 744-751
  • [29] Identification of Directed Influence: Granger Causality, Kullback-Leibler Divergence, and Complexity
    Seghouane, Abd-Krim
    Amari, Shun-ichi
    NEURAL COMPUTATION, 2012, 24 (07): 1722-1739
  • [30] Comparing Score-Based Methods for Estimating Bayesian Networks Using the Kullback-Leibler Divergence
    Kasza, Jessica
    Solomon, Patty
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2015, 44 (01): 135-152