A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications

Cited by: 28
Authors
Marsiglietti, Arnaud [1 ]
Kostina, Victoria [2 ]
Affiliations
[1] CALTECH, Ctr Math Informat, Pasadena, CA 91125 USA
[2] CALTECH, Dept Elect Engn, Pasadena, CA 91125 USA
Funding
U.S. National Science Foundation
Keywords
differential entropy; reverse entropy power inequality; rate-distortion function; Shannon lower bound; channel capacity; log-concave distribution; hyperplane conjecture;
DOI
10.3390/e20030185
Chinese Library Classification
O4 [Physics]
Discipline Code
0702
Abstract
We derive a lower bound on the differential entropy of a log-concave random variable X in terms of the p-th absolute moment of X. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources and distortion measure d(x, x̂) = |x − x̂|^r, with r ≥ 1, and we establish that the difference between the rate-distortion function and the Shannon lower bound is at most log(√(πe)) ≈ 1.5 bits, independently of r and the target distortion d. For mean-square error distortion, the difference is at most log(√(πe/2)) ≈ 1 bit, regardless of d. We also provide bounds on the capacity of memoryless additive noise channels when the noise is log-concave. We show that the difference between the capacity of such channels and the capacity of the Gaussian channel with the same noise power is at most log(√(πe/2)) ≈ 1 bit. Our results generalize to the case of a random vector X with possibly dependent coordinates. Our proof technique leverages tools from convex geometry.
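As a quick numerical companion to the abstract (this sketch is not from the paper), the Python snippet below evaluates the two constants log(√(πe)) and log(√(πe/2)) in bits, and then applies the mean-square-error bound to an assumed unit-scale Laplace source, chosen here only because it is log-concave and its differential entropy, log2(2e) bits, is known in closed form; the scale b and target distortion d are illustrative values, not quantities from the paper.

```python
# Minimal numerical sketch (not the authors' code): check the constants quoted
# in the abstract and illustrate the mean-square-error (MSE) bound on an
# assumed Laplace(1) source, which is log-concave.
import math

# Gap between the rate-distortion function and the Shannon lower bound:
#   at most log2(sqrt(pi*e)) bits for distortion |x - xhat|^r, r >= 1,
#   at most log2(sqrt(pi*e/2)) bits for MSE (r = 2).
gap_general = 0.5 * math.log2(math.pi * math.e)        # log2(sqrt(pi*e))
gap_mse = 0.5 * math.log2(math.pi * math.e / 2.0)      # log2(sqrt(pi*e/2))
print(f"log2(sqrt(pi*e))   = {gap_general:.3f} bits (abstract: ~1.5 bits)")
print(f"log2(sqrt(pi*e/2)) = {gap_mse:.3f} bits (abstract: ~1 bit)")

# Illustration with an assumed Laplace(b) source: h(X) = log2(2*b*e) bits, and
# the Shannon lower bound for MSE distortion d is SLB(d) = h(X) - 0.5*log2(2*pi*e*d).
# The abstract then gives SLB(d) <= R(d) <= SLB(d) + gap_mse.
b, d = 1.0, 0.1                              # assumed scale and target distortion
h_laplace = math.log2(2.0 * b * math.e)      # differential entropy of Laplace(b), in bits
slb = h_laplace - 0.5 * math.log2(2.0 * math.pi * math.e * d)
print(f"Laplace(1), d = {d}: SLB = {slb:.3f} bits, so R(d) <= {slb + gap_mse:.3f} bits")
```

For a Gaussian source under mean-square error the rate-distortion function meets the Shannon lower bound exactly, so the ≈ 1 bit figure is a worst-case allowance over the whole log-concave class rather than a typical gap.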
Pages: 24
Related Papers (50 in total)
  • [1] A lower bound on the differential entropy for log-concave random variables with applications to rate-distortion theory
    Marsiglietti, Arnaud
    Kostina, Victoria
    2017 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2017, : 46 - 50
  • [2] A reverse entropy power inequality for log-concave random vectors
    Ball, Keith
    Nayar, Piotr
    Tkocz, Tomasz
    STUDIA MATHEMATICA, 2016, 235 (01) : 17 - 30
  • [3] Wasserstein Stability of the Entropy Power Inequality for Log-Concave Random Vectors
    Courtade, Thomas A.
    Fathi, Max
    Pananjady, Ashwin
    2017 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), 2017, : 659 - 663
  • [4] Entropy jumps for isotropic log-concave random vectors and spectral gap
    Ball, Keith
    Nguyen, Van Hoang
    STUDIA MATHEMATICA, 2012, 213 (01) : 81 - 96
  • [5] Entropy and Information jump for log-concave vectors
    Bizeul, Pierre
    COMPTES RENDUS MATHEMATIQUE, 2023, 361 (01) : 487 - 493
  • [6] Norms of weighted sums of log-concave random vectors
    Chasapis, Giorgos
    Giannopoulos, Apostolos
    Skarmogiannis, Nikos
    COMMUNICATIONS IN CONTEMPORARY MATHEMATICS, 2020, 22 (04)
  • [7] On Some Problems Concerning Log-Concave Random Vectors
    Latala, Rafal
    CONVEXITY AND CONCENTRATION, 2017, 161 : 525 - 539
  • [8] Tail estimates for norms of sums of log-concave random vectors
    Adamczak, Radoslaw
    Latala, Rafal
    Litvak, Alexander E.
    Pajor, Alain
    Tomczak-Jaegermann, Nicole
    PROCEEDINGS OF THE LONDON MATHEMATICAL SOCIETY, 2014, 108 : 600 - 637
  • [9] An Inequality for Moments of Log-Concave Functions on Gaussian Random Vectors
    Dafnis, Nikos
    Paouris, Grigoris
    GEOMETRIC ASPECTS OF FUNCTIONAL ANALYSIS, 2017, 2169 : 107 - 122
  • [10] A lower-bound on monopoly profit for log-concave demand
    Condorelli, Daniele
    ECONOMICS LETTERS, 2022, 210