Sumset Inequalities for Differential Entropy and Mutual Information

Cited by: 0
Authors
Kontoyiannis, Ioannis [1 ]
Madiman, Mokshay [1 ]
Affiliations
[1] Athens Univ Econ & Business, Dept Informat, Patiss 76, Athens 10434, Greece
Keywords: (none listed)
DOI: (not available)
CLC number: TP301 [Theory and Methods];
Discipline code: 081202;
Abstract
The Plünnecke–Ruzsa sumset theory gives bounds connecting the cardinality of the sumset A + B = {a + b : a ∈ A, b ∈ B} with the cardinalities of the original sets A, B. For example, the sum-difference bound states that |A + B| |A| |B| ≤ |A − B|^3, where A − B = {a − b : a ∈ A, b ∈ B}. Interpreting the differential entropy h(X) as (the logarithm of) the size of the effective support of X, the main results here are a series of natural information-theoretic analogs of these bounds. For example, the sum-difference bound becomes the new inequality h(X + Y) + h(X) + h(Y) ≤ 3h(X − Y) for independent X, Y. Our results include differential-entropy versions of Ruzsa's triangle inequality, the Plünnecke–Ruzsa inequality, and the Balog–Szemerédi–Gowers lemma. Versions of most of these results for the discrete entropy H(X) were recently proved by Tao, relying heavily on a strong, functional form of the submodularity property of H(X). Since differential entropy is not functionally submodular, many of the corresponding discrete proofs fail in the continuous case, and several require substantially new proof strategies. The basic property that naturally replaces functional submodularity is the data processing property of mutual information.
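As a minimal sanity check of the sum-difference inequality h(X + Y) + h(X) + h(Y) ≤ 3h(X − Y), one can evaluate both sides in closed form for independent Gaussian X and Y, where h is available exactly; in that case X + Y and X − Y have the same variance, and the inequality reduces to var_x · var_y ≤ (var_x + var_y)^2. The function names below are illustrative, not from the paper:

```python
import math

def h_gauss(var):
    """Differential entropy (in nats) of a Gaussian with variance var."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

def check_sum_difference(var_x, var_y):
    """Check h(X+Y) + h(X) + h(Y) <= 3 h(X-Y) for independent Gaussians.

    For independent X, Y, both X + Y and X - Y are Gaussian with
    variance var_x + var_y, so h(X+Y) = h(X-Y).
    """
    lhs = h_gauss(var_x + var_y) + h_gauss(var_x) + h_gauss(var_y)
    rhs = 3 * h_gauss(var_x + var_y)
    return lhs <= rhs + 1e-12  # small tolerance for floating point

# Holds for all variance pairs, since var_x*var_y <= (var_x+var_y)**2.
for vx, vy in [(1.0, 1.0), (0.1, 10.0), (3.5, 0.02)]:
    assert check_sum_difference(vx, vy)
```

This only verifies the Gaussian special case; the paper's contribution is the inequality for arbitrary independent X, Y with densities.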
Pages: 5