Sumset and Inverse Sumset Inequalities for Differential Entropy and Mutual Information

Cited by: 37
Authors
Kontoyiannis, Ioannis [1 ]
Madiman, Mokshay [2 ]
Affiliations
[1] Athens Univ Econ & Business, Dept Informat, Athens 10434, Greece
[2] Univ Delaware, Dept Math Sci, Newark, DE 19716 USA
Funding
U.S. National Science Foundation;
Keywords
Shannon entropy; differential entropy; sumset bounds; inequalities; submodularity; data processing; mutual information;
DOI
10.1109/TIT.2014.2322861
CLC number
TP [Automation & Computer Technology];
Discipline code
0812;
Abstract
The sumset and inverse sumset theories of Freiman, Plünnecke, and Ruzsa give bounds connecting the cardinality of the sumset A + B = {a + b : a ∈ A, b ∈ B} of two discrete sets A, B to the cardinalities (or the finer structure) of the original sets A, B. For example, the sum-difference bound of Ruzsa states that |A + B| |A| |B| ≤ |A − B|³, where the difference set A − B = {a − b : a ∈ A, b ∈ B}. Interpreting the differential entropy h(X) of a continuous random variable X as (the logarithm of) the size of the effective support of X, the main contribution of this paper is a series of natural information-theoretic analogs of these results. For example, the Ruzsa sum-difference bound becomes the new inequality h(X + Y) + h(X) + h(Y) ≤ 3h(X − Y) for any pair of independent continuous random variables X and Y. Our results include differential-entropy versions of Ruzsa's triangle inequality, the Plünnecke-Ruzsa inequality, and the Balog-Szemerédi-Gowers lemma. In addition, we give a differential-entropy version of a Freiman-type inverse-sumset theorem, which can be seen as a quantitative converse to the entropy power inequality. Versions of most of these results for the discrete entropy H(X) were recently proved by Tao, relying heavily on a strong, functional form of the submodularity property of H(X). Since differential entropy is not functionally submodular, in the continuous case many of the corresponding discrete proofs fail, and new proof strategies are required in many cases. We find that the basic property that naturally replaces discrete functional submodularity is the data processing property of mutual information.
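As a quick sanity check on the continuous sum-difference inequality h(X + Y) + h(X) + h(Y) ≤ 3h(X − Y), here is a minimal Python sketch for the special case of independent Gaussians, where differential entropy has the closed form h(X) = ½ log(2πeσ²); the helper name gaussian_h and the chosen variances are illustrative, not from the paper:

```python
import math

def gaussian_h(var):
    """Differential entropy (in nats) of a Gaussian with variance `var`:
    h = 0.5 * log(2 * pi * e * var)."""
    return 0.5 * math.log(2 * math.pi * math.e * var)

# For independent X ~ N(0, vx) and Y ~ N(0, vy), both X + Y and X - Y
# are Gaussian with variance vx + vy, so h(X + Y) = h(X - Y) here.
for vx, vy in [(1.0, 1.0), (0.5, 4.0), (10.0, 0.1)]:
    lhs = gaussian_h(vx + vy) + gaussian_h(vx) + gaussian_h(vy)  # h(X+Y) + h(X) + h(Y)
    rhs = 3 * gaussian_h(vx + vy)                                # 3 h(X-Y)
    print(f"vx={vx}, vy={vy}: {lhs:.4f} <= {rhs:.4f} -> {lhs <= rhs}")
```

In this Gaussian case the bound reduces to h(X) + h(Y) ≤ 2h(X − Y), i.e. σ_X² σ_Y² ≤ (σ_X² + σ_Y²)², which always holds; the paper's inequality is of course far more general, covering arbitrary independent continuous random variables.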
Pages: 4503-4514
Page count: 12