Sumset Inequalities for Differential Entropy and Mutual Information

Cited by: 0
Authors: Kontoyiannis, Ioannis [1]; Madiman, Mokshay [1]
Institution: [1] Athens Univ Econ & Business, Dept of Informatics, Patission 76, Athens 10434, Greece
Keywords: (none listed)
DOI: (not available)
CLC number: TP301 [Theory and Methods]
Discipline code: 081202
Abstract
The Plünnecke-Ruzsa sumset theory gives bounds connecting the cardinality of the sumset A + B = {a + b : a ∈ A, b ∈ B} with the cardinalities of the original sets A and B. For example, the sum-difference bound states that |A + B| |A| |B| ≤ |A − B|³, where A − B = {a − b : a ∈ A, b ∈ B}. Interpreting the differential entropy h(X) as (the logarithm of) the size of the effective support of X, the main results here are a series of natural information-theoretic analogs of these bounds. For example, the sum-difference bound becomes the new inequality h(X + Y) + h(X) + h(Y) ≤ 3h(X − Y) for independent X, Y. Our results include differential-entropy versions of Ruzsa's triangle inequality, the Plünnecke-Ruzsa inequality, and the Balog-Szemerédi-Gowers lemma. Versions of most of these results for the discrete entropy H(X) were recently proved by Tao, relying heavily on a strong, functional form of the submodularity property of H(X). Since differential entropy is not functionally submodular, many of the corresponding discrete proofs fail in the continuous case, and several require substantially new proof strategies. The basic property that naturally replaces functional submodularity is the data processing property of mutual information.
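As a quick numerical illustration of the two bounds quoted above, the following minimal Python sketch checks the cardinality bound by brute force on small integer sets and the entropy analog for independent Gaussians, using the closed form h(N(0, s²)) = ½ log(2πe s²) and the fact that both X + Y and X − Y are then N(0, σ_X² + σ_Y²). The particular sets A, B and variances sx2, sy2 are arbitrary choices made for illustration, not values from the paper.

# Sanity check of the sum-difference bound and its entropy analog
# (illustrative only; sets and variances are arbitrary choices).
import math
from itertools import product

# Discrete sum-difference bound: |A + B| * |A| * |B| <= |A - B|^3.
A = {0, 1, 4}
B = {0, 2, 3}
sumset = {a + b for a, b in product(A, B)}
diffset = {a - b for a, b in product(A, B)}
assert len(sumset) * len(A) * len(B) <= len(diffset) ** 3

# Entropy analog for independent X ~ N(0, sx2), Y ~ N(0, sy2):
# X + Y and X - Y are both N(0, sx2 + sy2).
def h_gauss(s2):
    # Differential entropy of N(0, s2) in nats: 0.5 * log(2*pi*e*s2).
    return 0.5 * math.log(2 * math.pi * math.e * s2)

sx2, sy2 = 1.0, 3.0
lhs = h_gauss(sx2 + sy2) + h_gauss(sx2) + h_gauss(sy2)  # h(X+Y) + h(X) + h(Y)
rhs = 3 * h_gauss(sx2 + sy2)                            # 3 h(X-Y)
assert lhs <= rhs
print("both inequalities hold for this example")

In the Gaussian case the entropy inequality reduces to h(X) + h(Y) ≤ 2h(X − Y), i.e. σ_X² σ_Y² ≤ (σ_X² + σ_Y²)², which is immediate from the AM-GM inequality.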
Pages: 5