Bayesian estimation of the Kullback-Leibler divergence for categorical systems using mixtures of Dirichlet priors

Cited by: 0
Authors
Camaglia, Francesco [1 ,2 ]
Nemenman, Ilya [3 ]
Mora, Thierry [1 ,2 ]
Walczak, Aleksandra M. [1 ,2 ]
Affiliations
[1] Sorbonne Univ, PSL Univ, Lab Phys Ecole Normale Super, CNRS, F-75005 Paris, France
[2] Univ Paris, F-75005 Paris, France
[3] Emory Univ, Dept Phys, Dept Biol & Initiat Theory & Modeling Living Syst, Atlanta, GA 30322 USA
Funding
European Research Council;
Keywords
NONPARAMETRIC-ESTIMATION; ENTROPY ESTIMATION; PROBABILITY; INFORMATION; INFERENCE;
DOI
10.1103/PhysRevE.109.024305
Chinese Library Classification (CLC)
O35 [Fluid Mechanics]; O53 [Plasma Physics];
Discipline Classification Codes
070204; 080103; 080704;
Abstract
In many applications in biology, engineering, and economics, identifying similarities and differences between distributions of data from complex processes requires comparing finite categorical samples of discrete counts. Statistical divergences quantify the difference between two distributions. However, their estimation is very difficult and empirical methods often fail, especially when the samples are small. We develop a Bayesian estimator of the Kullback-Leibler divergence between two probability distributions that makes use of a mixture of Dirichlet priors on the distributions being compared. We study the properties of the estimator on two examples: probabilities drawn from Dirichlet distributions and random strings of letters drawn from Markov chains. We extend the approach to the squared Hellinger divergence. Both estimators outperform other estimation techniques, with better results for data with a large number of categories and for higher values of divergences.
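To make the estimator described in the abstract concrete, here is a minimal Python sketch (not the authors' code) of its fixed-concentration building block: the posterior mean of D_KL(p||q) when each distribution is given a single symmetric Dirichlet prior. It uses the standard Dirichlet identities E[p_i log p_i] = (x_i/X)(psi(x_i + 1) - psi(X + 1)) and E[log q_i] = psi(y_i) - psi(Y). The paper's full method additionally averages this quantity over a mixture of Dirichlet concentration parameters, which is not reproduced here; the function name and the parameters alpha and beta are illustrative assumptions.

import numpy as np
from scipy.special import digamma

def dkl_posterior_mean(counts_p, counts_q, alpha=1.0, beta=1.0):
    """Posterior mean of D_KL(p || q) in nats, under fixed symmetric
    Dirichlet priors Dir(alpha) on p and Dir(beta) on q.

    counts_p, counts_q : integer count vectors over the same K categories
    """
    x = np.asarray(counts_p, dtype=float) + alpha  # Dirichlet posterior parameters for p
    y = np.asarray(counts_q, dtype=float) + beta   # Dirichlet posterior parameters for q
    X, Y = x.sum(), y.sum()
    # E[p_i log p_i] = (x_i/X) * (psi(x_i + 1) - psi(X + 1))
    # E[p_i log q_i] = (x_i/X) * (psi(y_i) - psi(Y)), since p and q are
    # a posteriori independent given the two samples.
    return np.sum((x / X) * (digamma(x + 1.0) - digamma(X + 1.0)
                             - digamma(y) + digamma(Y)))

# Quick sanity check on synthetic data of the kind studied in the paper:
# probabilities drawn from Dirichlet distributions, then multinomial counts.
rng = np.random.default_rng(0)
K = 50
p = rng.dirichlet(np.ones(K))
q = rng.dirichlet(np.ones(K))
counts_p = rng.multinomial(200, p)
counts_q = rng.multinomial(200, q)
true_dkl = np.sum(p * np.log(p / q))
print(f"true D_KL      = {true_dkl:.3f}")
print(f"posterior mean = {dkl_posterior_mean(counts_p, counts_q):.3f}")

Mixing over the concentration parameters (as the paper does, in the spirit of the Nemenman-Shafee-Bialek entropy estimator) replaces the hand-picked alpha and beta above with a weighted average of this closed form over a range of concentrations.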
Pages: 13