Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information

Cited: 37
Authors
Kontoyiannis, Ioannis [1 ]
Madiman, Mokshay [2 ]
Institutions
[1] Athens Univ Econ & Business, Dept Informat, Athens 10434, Greece
[2] Univ Delaware, Dept Math Sci, Newark, DE 19716 USA
Funding
US National Science Foundation;
Keywords
Shannon entropy; differential entropy; sumset bounds; inequalities; submodularity; data processing; mutual information;
DOI
10.1109/TIT.2014.2322861
CLC number
TP [Automation Technology, Computer Technology];
Subject classification code
0812;
Abstract
The sumset and inverse sumset theories of Freiman, Plünnecke, and Ruzsa give bounds connecting the cardinality of the sumset A + B = {a + b : a ∈ A, b ∈ B} of two discrete sets A, B to the cardinalities (or the finer structure) of the original sets A, B. For example, the sum-difference bound of Ruzsa states that |A + B| |A| |B| ≤ |A − B|³, where the difference set A − B = {a − b : a ∈ A, b ∈ B}. Interpreting the differential entropy h(X) of a continuous random variable X as (the logarithm of) the size of the effective support of X, the main contribution of this paper is a series of natural information-theoretic analogs of these results. For example, the Ruzsa sum-difference bound becomes the new inequality h(X + Y) + h(X) + h(Y) ≤ 3h(X − Y) for any pair of independent continuous random variables X and Y. Our results include differential-entropy versions of Ruzsa's triangle inequality, the Plünnecke-Ruzsa inequality, and the Balog-Szemerédi-Gowers lemma. In addition, we give a differential-entropy version of a Freiman-type inverse-sumset theorem, which can be seen as a quantitative converse to the entropy power inequality. Versions of most of these results for the discrete entropy H(X) were recently proved by Tao, relying heavily on a strong, functional form of the submodularity property of H(X). Since differential entropy is not functionally submodular, many of the corresponding discrete proofs fail in the continuous case, in many instances requiring substantially new proof strategies. We find that the basic property that naturally replaces discrete functional submodularity is the data-processing property of mutual information.
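The two inequalities featured in the abstract can be checked numerically. The sketch below is illustrative only (the sets A, B and the Gaussian variances are arbitrary choices, not from the paper): it verifies Ruzsa's sum-difference bound |A + B| |A| |B| ≤ |A − B|³ on small integer sets, and the entropy analog h(X + Y) + h(X) + h(Y) ≤ 3h(X − Y) in the closed-form Gaussian case, where h(N(0, σ²)) = ½ log(2πeσ²) and, for independent X, Y, both X + Y and X − Y are N(0, σx² + σy²).

```python
import math

def sumset(A, B):
    # A + B = {a + b : a in A, b in B}
    return {a + b for a in A for b in B}

def diffset(A, B):
    # A - B = {a - b : a in A, b in B}
    return {a - b for a in A for b in B}

# Arbitrary small example sets (not from the paper).
A = {0, 1, 3, 7}
B = {0, 2, 5}
lhs = len(sumset(A, B)) * len(A) * len(B)   # |A+B| |A| |B|
rhs = len(diffset(A, B)) ** 3               # |A-B|^3
assert lhs <= rhs  # Ruzsa sum-difference bound holds here

def h_gauss(s2):
    # Differential entropy of N(0, s2) in nats.
    return 0.5 * math.log(2 * math.pi * math.e * s2)

# Independent X ~ N(0, sx2), Y ~ N(0, sy2); X+Y and X-Y are N(0, sx2+sy2).
sx2, sy2 = 1.0, 4.0
lhs_h = h_gauss(sx2 + sy2) + h_gauss(sx2) + h_gauss(sy2)
rhs_h = 3 * h_gauss(sx2 + sy2)
assert lhs_h <= rhs_h  # entropy analog holds in the Gaussian case
```

In the Gaussian case the inequality reduces to h(X) + h(Y) ≤ 2h(X − Y), i.e. log(σx²σy²) ≤ log((σx² + σy²)²), which always holds.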
Pages: 4503 - 4514
Number of pages: 12
Related Papers
50 records in total
  • [31] Entropy Power, Autoregressive Models, and Mutual Information
    Gibson, Jerry
    ENTROPY, 2018, 20 (10)
  • [32] UNCERTAINTY, ENTROPY, AND MUTUAL INFORMATION FOR QUANTUM STATES
    RAI, S
    JOURNAL OF THE OPTICAL SOCIETY OF AMERICA B-OPTICAL PHYSICS, 1992, 9 (04) : 590 - 594
  • [33] Role of mutual information in entropy production under information exchanges
    Sagawa, Takahiro
    Ueda, Masahito
    NEW JOURNAL OF PHYSICS, 2013, 15
  • [34] MUTUAL INFORMATION FOR STOCHASTIC DIFFERENTIAL EQUATIONS
    DUNCAN, TE
    INFORMATION AND CONTROL, 1971, 19 (03): 265 - &
  • [35] Belavkin-Staszewski Relative Entropy, Conditional Entropy, and Mutual Information
    Zhai, Yuan
    Yang, Bo
    Xi, Zhengjun
    ENTROPY, 2022, 24 (06)
  • [36] On characterization of entropy function via information inequalities
    Zhang, Z
    Yeung, RW
    IEEE TRANSACTIONS ON INFORMATION THEORY, 1998, 44 (04) : 1440 - 1452
  • [37] On characterization of entropy function via information inequalities
    Zhang, Z
    Yeung, RW
    1998 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY - PROCEEDINGS, 1998, : 375 - 375
  • [38] Information Theoretic Proofs of Entropy Power Inequalities
    Rioul, Olivier
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2011, 57 (01) : 33 - 55
  • [39] Renyi mutual information inequalities from Rindler positivity
    Blanco, David
    Lanosa, Leandro
    Leston, Mauricio
    Perez-Nadal, Guillem
    JOURNAL OF HIGH ENERGY PHYSICS, 2019, 2019 (12)
  • [40] On additive-combinatorial affine inequalities for Shannon entropy and differential entropy
    Makkuva, Ashok Vardhan
    Wu, Yihong
    2016 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY, 2016, : 1053 - 1057