Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information

Cited by: 37
Authors
Kontoyiannis, Ioannis [1 ]
Madiman, Mokshay [2 ]
Affiliations
[1] Athens Univ Econ & Business, Dept Informat, Athens 10434, Greece
[2] Univ Delaware, Dept Math Sci, Newark, DE 19716 USA
Funding
US National Science Foundation;
Keywords
Shannon entropy; differential entropy; sumset bounds; inequalities; submodularity; data processing; mutual information;
DOI
10.1109/TIT.2014.2322861
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
The sumset and inverse sumset theories of Freiman, Plünnecke, and Ruzsa give bounds connecting the cardinality of the sumset A + B = {a + b; a ∈ A, b ∈ B} of two discrete sets A, B to the cardinalities (or the finer structure) of the original sets A, B. For example, the sum-difference bound of Ruzsa states that |A + B| |A| |B| ≤ |A - B|^3, where the difference set A - B = {a - b; a ∈ A, b ∈ B}. Interpreting the differential entropy h(X) of a continuous random variable X as (the logarithm of) the size of the effective support of X, the main contribution of this paper is a series of natural information-theoretic analogs of these results. For example, the Ruzsa sum-difference bound becomes the new inequality h(X + Y) + h(X) + h(Y) ≤ 3h(X - Y) for any pair of independent continuous random variables X and Y. Our results include differential-entropy versions of Ruzsa's triangle inequality, the Plünnecke-Ruzsa inequality, and the Balog-Szemerédi-Gowers lemma. In addition, we give a differential-entropy version of a Freiman-type inverse-sumset theorem, which can be seen as a quantitative converse to the entropy power inequality. Versions of most of these results for the discrete entropy H(X) were recently proved by Tao, relying heavily on a strong, functional form of the submodularity property of H(X). Since differential entropy is not functionally submodular, many of the corresponding discrete proofs fail in the continuous case, often requiring substantially new proof strategies. We find that the basic property that naturally replaces discrete functional submodularity is the data processing property of mutual information.
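As an illustrative aside (not part of the paper), the two displayed inequalities are easy to sanity-check numerically. The minimal Python sketch below verifies Ruzsa's sum-difference bound by brute force on random integer sets, and checks the differential-entropy analog for independent Gaussians, where h is available in closed form as h(N(0, s^2)) = (1/2) ln(2*pi*e*s^2); the helper names sumset, diffset, and h_gauss, as well as the set sizes and variances, are arbitrary illustrative choices.

import math
import random

def sumset(A, B):
    # All pairwise sums a + b, a in A, b in B.
    return {a + b for a in A for b in B}

def diffset(A, B):
    # All pairwise differences a - b, a in A, b in B.
    return {a - b for a in A for b in B}

# Discrete check: Ruzsa's sum-difference bound |A+B| |A| |B| <= |A-B|^3.
random.seed(0)
for _ in range(1000):
    A = set(random.sample(range(50), random.randint(2, 10)))
    B = set(random.sample(range(50), random.randint(2, 10)))
    assert len(sumset(A, B)) * len(A) * len(B) <= len(diffset(A, B)) ** 3

# Continuous analog for independent X ~ N(0, sx2), Y ~ N(0, sy2):
# both X + Y and X - Y are N(0, sx2 + sy2), so h is closed-form.
def h_gauss(var):
    return 0.5 * math.log(2 * math.pi * math.e * var)

sx2, sy2 = 1.0, 4.0                                     # example variances
lhs = h_gauss(sx2 + sy2) + h_gauss(sx2) + h_gauss(sy2)  # h(X+Y) + h(X) + h(Y)
rhs = 3 * h_gauss(sx2 + sy2)                            # 3 h(X-Y)
assert lhs <= rhs + 1e-12
print("Both checks pass.")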
Pages: 4503 - 4514
Number of pages: 12
Related Papers
50 records in total
  • [41] Copula-Based Mutual Information Measures and Mutual Entropy: A Brief Survey
    Ghosh, Indranil
    Sunoj, S. M.
    MATHEMATICAL METHODS OF STATISTICS, 2024, 33 (03) : 297 - 309
  • [42] Mutual Information Driven Inverse Consistent Nonlinear Registration
    Tao, Guozhi
    He, Renjie
    Datta, Sushmita
    Narayana, Ponnada A.
    2008 30TH ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY, VOLS 1-8, 2008, : 3957 - 3960
  • [43] Mutual Information and Relative Entropy of Sequential Effect Algebras
    Wang, Jia-Mei
    Wu, Jun-De
    Cho, Minhyung
    COMMUNICATIONS IN THEORETICAL PHYSICS, 2010, 54 (08) : 215 - 218
  • [44] The effect of anisotropy on holographic entanglement entropy and mutual information
    Liu, Peng
    Niu, Chao
    Wu, Jian-Pin
    PHYSICS LETTERS B, 2019, 796 : 155 - 161
  • [45] Efficient Approximate Algorithms for Empirical Entropy and Mutual Information
    Chen, Xingguang
    Wang, Sibo
    SIGMOD '21: PROCEEDINGS OF THE 2021 INTERNATIONAL CONFERENCE ON MANAGEMENT OF DATA, 2021, : 274 - 286
  • [46] Mutual information entropy research on dementia EEG signals
    Qi, HZ
    Wan, BK
    Zhao, L
    FOURTH INTERNATIONAL CONFERENCE ON COMPUTER AND INFORMATION TECHNOLOGY, PROCEEDINGS, 2004, : 885 - 889
  • [47] Remarks on Rényi versions of conditional entropy and mutual information
    Aishwarya, Gautam
    Madiman, Mokshay
    2019 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2019, : 1117 - 1121
  • [48] Mutual Information, Relative Entropy, and Estimation in the Poisson Channel
    Atar, Rami
    Weissman, Tsachy
    2011 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY PROCEEDINGS (ISIT), 2011, : 708 - 712
  • [49] CONDITIONAL ENTROPY AND MUTUAL INFORMATION IN RANDOM CASCADING PROCESSES
    WU, YF
    LIU, LS
    PHYSICAL REVIEW D, 1991, 43 (09): : 3077 - 3079
  • [50] Entropy and mutual information in models of deep neural networks
    Gabrie, Marylou
    Manoel, Andre
    Luneau, Clement
    Barbier, Jean
    Macris, Nicolas
    Krzakala, Florent
    Zdeborova, Lenka
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31