Bayesian estimation of the Kullback-Leibler divergence for categorical systems using mixtures of Dirichlet priors

Cited by: 0
Authors
Camaglia, Francesco [1 ,2 ]
Nemenman, Ilya [3 ]
Mora, Thierry [1 ,2 ]
Walczak, Aleksandra M. [1 ,2 ]
Affiliations
[1] Sorbonne Univ, PSL Univ, Lab Phys Ecole Normale Super, CNRS, F-75005 Paris, France
[2] Univ Paris, F-75005 Paris, France
[3] Emory Univ, Dept Phys, Dept Biol & Initiat Theory & Modeling Living Syst, Atlanta, GA 30322 USA
Funding
European Research Council
Keywords
NONPARAMETRIC-ESTIMATION; ENTROPY ESTIMATION; PROBABILITY; INFORMATION; INFERENCE;
DOI
10.1103/PhysRevE.109.024305
Chinese Library Classification (CLC)
O35 [Fluid Mechanics]; O53 [Plasma Physics]
Discipline codes
070204; 080103; 080704
Abstract
In many applications in biology, engineering, and economics, identifying similarities and differences between distributions of data from complex processes requires comparing finite categorical samples of discrete counts. Statistical divergences quantify the difference between two distributions. However, their estimation is very difficult and empirical methods often fail, especially when the samples are small. We develop a Bayesian estimator of the Kullback-Leibler divergence between two probability distributions that makes use of a mixture of Dirichlet priors on the distributions being compared. We study the properties of the estimator on two examples: probabilities drawn from Dirichlet distributions and random strings of letters drawn from Markov chains. We extend the approach to the squared Hellinger divergence. Both estimators outperform other estimation techniques, with better results for data with a large number of categories and for higher values of divergences.
Pages: 13
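As a rough illustration of the Bayesian approach described in the abstract, the sketch below estimates the posterior means of the Kullback-Leibler and squared Hellinger divergences between two categorical samples by Monte Carlo, using fixed symmetric Dirichlet priors. It is a minimal stand-in, not the paper's mixture-of-Dirichlet-priors estimator; the function names, the hyperparameters alpha and beta, and the toy counts are illustrative assumptions.

```python
import numpy as np


def _posterior_draws(counts, concentration, n_samples, rng):
    """Draw categorical distributions from a Dirichlet posterior:
    symmetric prior concentration plus the observed counts."""
    posterior = np.asarray(counts, dtype=float) + concentration
    return rng.dirichlet(posterior, size=n_samples)


def bayes_kl_estimate(counts_p, counts_q, alpha=1.0, beta=1.0,
                      n_samples=5000, seed=None):
    """Monte Carlo posterior-mean estimate of D_KL(p || q) under
    independent symmetric Dirichlet priors on the two distributions.
    NOTE: a simplified single-Dirichlet sketch, not the paper's
    mixture-of-Dirichlet-priors estimator."""
    rng = np.random.default_rng(seed)
    p = _posterior_draws(counts_p, alpha, n_samples, rng)
    q = _posterior_draws(counts_q, beta, n_samples, rng)
    # KL divergence of each posterior draw; the sample mean approximates
    # the posterior expectation of the divergence.
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=1)
    return kl.mean()


def bayes_hellinger2_estimate(counts_p, counts_q, alpha=1.0, beta=1.0,
                              n_samples=5000, seed=None):
    """Same idea for the squared Hellinger divergence,
    H^2(p, q) = 1 - sum_i sqrt(p_i * q_i)."""
    rng = np.random.default_rng(seed)
    p = _posterior_draws(counts_p, alpha, n_samples, rng)
    q = _posterior_draws(counts_q, beta, n_samples, rng)
    return (1.0 - np.sum(np.sqrt(p * q), axis=1)).mean()


# Toy example: two small samples over K = 5 categories (illustrative data).
counts_p = [12, 3, 0, 7, 1]
counts_q = [2, 9, 4, 4, 5]
print(bayes_kl_estimate(counts_p, counts_q, seed=0))
print(bayes_hellinger2_estimate(counts_p, counts_q, seed=0))
```

The fixed concentrations alpha and beta here are what the paper's method avoids: per the abstract, the full estimator places a mixture of Dirichlet priors on each distribution rather than committing to a single concentration value.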