Bayesian estimation of the Kullback-Leibler divergence for categorical systems using mixtures of Dirichlet priors

Cited by: 0
Authors
Camaglia, Francesco [1 ,2 ]
Nemenman, Ilya [3 ]
Mora, Thierry [1 ,2 ]
Walczak, Aleksandra M. [1 ,2 ]
Affiliations
[1] Sorbonne Univ, PSL Univ, Lab Phys Ecole Normale Super, CNRS, F-75005 Paris, France
[2] Univ Paris, F-75005 Paris, France
[3] Emory Univ, Dept Phys, Dept Biol & Initiat Theory & Modeling Living Syst, Atlanta, GA 30322 USA
Funding
European Research Council
Keywords
NONPARAMETRIC-ESTIMATION; ENTROPY ESTIMATION; PROBABILITY; INFORMATION; INFERENCE;
DOI
10.1103/PhysRevE.109.024305
Chinese Library Classification
O35 [Fluid Mechanics]; O53 [Plasma Physics]
Discipline Codes
070204; 080103; 080704
Abstract
In many applications in biology, engineering, and economics, identifying similarities and differences between distributions of data from complex processes requires comparing finite categorical samples of discrete counts. Statistical divergences quantify the difference between two distributions. However, their estimation is very difficult and empirical methods often fail, especially when the samples are small. We develop a Bayesian estimator of the Kullback-Leibler divergence between two probability distributions that makes use of a mixture of Dirichlet priors on the distributions being compared. We study the properties of the estimator on two examples: probabilities drawn from Dirichlet distributions and random strings of letters drawn from Markov chains. We extend the approach to the squared Hellinger divergence. Both estimators outperform other estimation techniques, with better results for data with a large number of categories and for higher values of divergences.
Pages: 13