Sumset Inequalities for Differential Entropy and Mutual Information

Cited by: 0
Authors
Kontoyiannis, Ioannis [1]
Madiman, Mokshay [1]
Affiliations
[1] Athens Univ Econ & Business, Dept Informat, Patiss 76, Athens 10434, Greece
Keywords
DOI
(none available)
Chinese Library Classification
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
The Plünnecke-Ruzsa sumset theory gives bounds connecting the cardinality of the sumset A + B = {a + b : a ∈ A, b ∈ B} with the cardinalities of the original sets A, B. For example, the sum-difference bound states that |A + B| |A| |B| ≤ |A − B|^3, where A − B = {a − b : a ∈ A, b ∈ B}. Interpreting the differential entropy h(X) as (the logarithm of) the size of the effective support of X, the main results here are a series of natural information-theoretic analogs of these bounds. For example, the sum-difference bound becomes the new inequality h(X + Y) + h(X) + h(Y) ≤ 3h(X − Y), for independent X, Y. Our results include differential-entropy versions of Ruzsa's triangle inequality, the Plünnecke-Ruzsa inequality, and the Balog-Szemerédi-Gowers lemma. Versions of most of these results for the discrete entropy H(X) were recently proved by Tao, relying heavily on a strong, functional form of the submodularity property of H(X). Since differential entropy is not functionally submodular, many of the corresponding discrete proofs fail in the continuous case, in several instances requiring substantially new proof strategies. The basic property that naturally replaces functional submodularity is the data processing property of mutual information.
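The two inequalities quoted in the abstract can be sanity-checked numerically. The sketch below (the `sumset` helper and the choice of random test sets are illustrative, not from the paper) verifies the cardinality bound |A + B| |A| |B| ≤ |A − B|^3 on random integer sets, and checks the differential-entropy analog in the special case of independent zero-mean Gaussians, where h(N(0, v)) = ½ ln(2πe·v) and X + Y, X − Y share the variance vx + vy:

```python
import math
import random

def sumset(A, B, op=lambda a, b: a + b):
    """Return {op(a, b) : a in A, b in B}; defaults to the sumset A + B."""
    return {op(a, b) for a in A for b in B}

random.seed(0)
for _ in range(1000):
    A = set(random.sample(range(-50, 50), random.randint(1, 8)))
    B = set(random.sample(range(-50, 50), random.randint(1, 8)))
    plus = sumset(A, B)                              # A + B
    minus = sumset(A, B, op=lambda a, b: a - b)      # A - B
    # Ruzsa's sum-difference bound: |A+B| |A| |B| <= |A-B|^3
    assert len(plus) * len(A) * len(B) <= len(minus) ** 3

def h_gauss(v):
    """Differential entropy (nats) of a Gaussian with variance v."""
    return 0.5 * math.log(2 * math.pi * math.e * v)

# For independent X ~ N(0, vx), Y ~ N(0, vy), both X+Y and X-Y are
# N(0, vx+vy), so h(X+Y) + h(X) + h(Y) <= 3 h(X-Y) reduces to
# vx * vy <= (vx + vy)^2, which always holds.
vx, vy = 2.0, 5.0
lhs = h_gauss(vx + vy) + h_gauss(vx) + h_gauss(vy)
rhs = 3 * h_gauss(vx + vy)
assert lhs <= rhs
print("both bounds held in all checks")
```

The Gaussian case is of course only the degenerate instance where sum and difference have identical entropy; the content of the paper is that the inequality holds for arbitrary independent continuous X, Y.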
Pages: 5