Mutual Information as a Function of Matrix SNR for Linear Gaussian Channels

Cited by: 0
Authors
Reeves, Galen [1 ,2 ]
Pfister, Henry D. [1 ]
Dytso, Alex [3 ]
Affiliations
[1] Duke Univ, Dept Elect Engn, Durham, NC 27706 USA
[2] Duke Univ, Dept Stat Sci, Durham, NC 27706 USA
[3] Princeton Univ, Dept Elect Engn, Princeton, NJ 08544 USA
Keywords
I-MMSE; entropy power inequality; conditional central limit theorem; random matrix theory; compressed sensing; Gaussian logarithmic Sobolev inequality; CAPACITY; CODES;
DOI
Not available
CLC number
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
This paper focuses on the mutual information and minimum mean-squared error (MMSE) as a function of a matrix-valued signal-to-noise ratio (SNR) for a linear Gaussian channel with arbitrary input distribution. As shown by Lamarca, the mutual information is a concave function of a positive semi-definite matrix, which we call the matrix SNR. This implies that the mapping from the matrix SNR to the MMSE matrix is monotone decreasing. Building upon these functional properties, we begin to construct a unifying framework that provides a bridge between classical information-theoretic inequalities, such as the entropy power inequality, and interpolation techniques used in statistical physics and random matrix theory. This framework provides new insight into the structure of phase transitions in coding theory and compressed sensing. In particular, it is shown that the parallel combination of linear channels with freely independent matrices can be characterized succinctly via free convolution.
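The functional properties the abstract builds on can be checked numerically in the simplest scalar special case. The sketch below (an illustration, not part of the paper) uses the standard closed forms for a Gaussian input X ~ N(0,1) over Y = sqrt(snr)·X + N with N ~ N(0,1): I(snr) = ½ ln(1 + snr) nats and mmse(snr) = 1/(1 + snr). It verifies the scalar I-MMSE relation dI/dsnr = ½·mmse(snr) by finite differences, and the monotone decrease of the MMSE in the SNR.

```python
import numpy as np

def mutual_info(snr):
    # Mutual information (in nats) for a scalar Gaussian input over a
    # Gaussian channel: I(snr) = 0.5 * ln(1 + snr).
    return 0.5 * np.log1p(snr)

def mmse(snr):
    # MMSE of estimating X from Y for the same channel: 1 / (1 + snr).
    return 1.0 / (1.0 + snr)

snr = 2.0
h = 1e-6
# Central-difference estimate of dI/dsnr.
dI = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)
# Scalar I-MMSE relation (in nats): dI/dsnr = 0.5 * mmse(snr).
print(abs(dI - 0.5 * mmse(snr)) < 1e-6)  # True

# Monotonicity: the MMSE is strictly decreasing in the SNR,
# consistent with the mutual information being concave in the SNR.
snrs = np.linspace(0.0, 10.0, 101)
print(np.all(np.diff(mmse(snrs)) < 0))  # True
```

The paper's contribution concerns the matrix-valued generalization of exactly this picture: concavity of the mutual information in the matrix SNR and the resulting monotonicity of the MMSE matrix.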
Pages: 1754 - 1758
Number of pages: 5