Rényi Divergence and Kullback-Leibler Divergence

Cited by: 903
Authors
van Erven, Tim [1 ]
Harremoës, Peter [2]
Affiliations
[1] Univ Paris 11, Dept Math, F-91405 Orsay, France
[2] Copenhagen Business Coll, DK-1358 Copenhagen, Denmark
Keywords
alpha-divergence; Bhattacharyya distance; information divergence; Kullback-Leibler divergence; Pythagorean inequality; Rényi divergence; INFORMATION; ENTROPY; PROBABILITY; DISTRIBUTIONS; CONVERGENCE; STATISTICS; PINSKERS; RATES;
DOI
10.1109/TIT.2014.2320500
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibler divergence. We review and extend the most important properties of Rényi divergence and Kullback-Leibler divergence, including convexity, continuity, limits of sigma-algebras, and the relation of the special order 0 to the Gaussian dichotomy and contiguity. We also show how to generalize the Pythagorean inequality to orders different from 1, and we extend the known equivalence between channel capacity and minimax redundancy to continuous channel inputs (for all orders) and present several other minimax results.
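For reference, a standard textbook form of the quantities named in the abstract, written for discrete distributions P and Q (the paper itself works with general measures), is the Rényi divergence of order α ≠ 1 together with its order-1 limit, the Kullback-Leibler divergence:

\[
D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \sum_{x} P(x)^{\alpha}\, Q(x)^{1-\alpha},
\qquad
\lim_{\alpha \to 1} D_\alpha(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)} = D_{\mathrm{KL}}(P \,\|\, Q).
\]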
Pages: 3797 - 3820
Number of pages: 24
Related Papers
50 items in total
  • [41] Detection of Test Collusion via Kullback-Leibler Divergence
    Belov, Dmitry I.
    JOURNAL OF EDUCATIONAL MEASUREMENT, 2013, 50 (02) : 141 - 163
  • [42] Trigonometric Moment Matching and Minimization of the Kullback-Leibler Divergence
    Kurz, Gerhard
    Hanebeck, Uwe D.
    IEEE TRANSACTIONS ON AEROSPACE AND ELECTRONIC SYSTEMS, 2015, 51 (04) : 3480 - 3484
  • [43] Parameter identifiability with Kullback-Leibler information divergence criterion
    Chen, Badong
    Hu, Jinchun
    Zhu, Yu
    Sun, Zengqi
    INTERNATIONAL JOURNAL OF ADAPTIVE CONTROL AND SIGNAL PROCESSING, 2009, 23 (10) : 940 - 960
  • [44] Guaranteed Bounds on the Kullback-Leibler Divergence of Univariate Mixtures
    Nielsen, Frank
    Sun, Ke
    IEEE SIGNAL PROCESSING LETTERS, 2016, 23 (11) : 1543 - 1546
  • [45] Use of the Kullback-Leibler Divergence in Estimating Clutter Distributions
    Ritchie, M. A.
    Charlish, A.
    Woodbridge, K.
    Stove, A. G.
    2011 IEEE RADAR CONFERENCE (RADAR), 2011, : 751 - 756
  • [46] Ellipticity and Circularity Measuring via Kullback-Leibler Divergence
    Misztal, Krzysztof
    Tabor, Jacek
    JOURNAL OF MATHEMATICAL IMAGING AND VISION, 2016, 55 (01) : 136 - 150
  • [47] Kullback-Leibler divergence for Bayesian nonparametric model checking
    Al-Labadi, Luai
    Patel, Viskakh
    Vakiloroayaei, Kasra
    Wan, Clement
    JOURNAL OF THE KOREAN STATISTICAL SOCIETY, 2021, 50 (01) : 272 - 289
  • [49] Kullback-Leibler divergence as an estimate of reproducibility of numerical results
    Calvayrac, Florent
    2015 7TH INTERNATIONAL CONFERENCE ON NEW TECHNOLOGIES, MOBILITY AND SECURITY (NTMS), 2015,
  • [50] Kullback-Leibler divergence measure of intermittency: Application to turbulence
    Granero-Belinchon, Carlos
    Roux, Stephane G.
    Garnier, Nicolas B.
    PHYSICAL REVIEW E, 2018, 97 (01):