Mutual Information as a Function of Matrix SNR for Linear Gaussian Channels

Cited by: 0
Authors
Reeves, Galen [1 ,2 ]
Pfister, Henry D. [1 ]
Dytso, Alex [3 ]
Affiliations
[1] Duke Univ, Dept Elect Engn, Durham, NC 27706 USA
[2] Duke Univ, Dept Stat Sci, Durham, NC 27706 USA
[3] Princeton Univ, Dept Elect Engn, Princeton, NJ 08544 USA
Keywords
I-MMSE; entropy power inequality; conditional central limit theorem; random matrix theory; compressed sensing; Gaussian logarithmic Sobolev inequality; CAPACITY; CODES;
DOI
Not available
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
This paper focuses on the mutual information and minimum mean-squared error (MMSE) as a function of a matrix-valued signal-to-noise ratio (SNR) for a linear Gaussian channel with arbitrary input distribution. As shown by Lamarca, the mutual information is a concave function of a positive semi-definite matrix, which we call the matrix SNR. This implies that the mapping from the matrix SNR to the MMSE matrix is monotone decreasing with respect to the Loewner order. Building upon these functional properties, we start to construct a unifying framework that provides a bridge between classical information-theoretic inequalities, such as the entropy power inequality, and interpolation techniques used in statistical physics and random matrix theory. This framework provides new insight into the structure of phase transitions in coding theory and compressed sensing. In particular, it is shown that the parallel combination of linear channels with freely independent matrices can be characterized succinctly via free convolution.
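To make the matrix-SNR setting concrete, here is a minimal numerical sketch (not taken from the paper) for the special case of a Gaussian input X ~ N(0, Sigma), where closed forms exist: the mutual information is I(S) = (1/2) log det(I + S Sigma) and the MMSE matrix is M(S) = (Sigma^{-1} + S)^{-1}. The script verifies the matrix I-MMSE relation, d/dt I(S + t Delta) at t = 0 equals (1/2) tr(Delta M(S)), by finite differences, and checks that increasing S in the Loewner order decreases the MMSE matrix. The function names are illustrative and NumPy is assumed; the paper itself treats arbitrary input distributions, which this Gaussian example only illustrates.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_psd(n):
    # Random positive semi-definite matrix A = B B^T.
    B = rng.standard_normal((n, n))
    return B @ B.T

def mutual_info(S, Sigma):
    # I(S) = (1/2) log det(I + S Sigma) for a Gaussian input N(0, Sigma).
    return 0.5 * np.linalg.slogdet(np.eye(len(S)) + S @ Sigma)[1]

def mmse_matrix(S, Sigma):
    # Gaussian-input MMSE matrix M(S) = (Sigma^{-1} + S)^{-1}.
    return np.linalg.inv(np.linalg.inv(Sigma) + S)

n = 4
Sigma = random_psd(n) + np.eye(n)   # strictly positive-definite input covariance
S = random_psd(n)                   # matrix SNR
Delta = random_psd(n)               # PSD perturbation direction

# I-MMSE check: the directional derivative of I at S along Delta
# should equal (1/2) tr(Delta M(S)).
eps = 1e-6
fd = (mutual_info(S + eps * Delta, Sigma)
      - mutual_info(S - eps * Delta, Sigma)) / (2 * eps)
exact = 0.5 * np.trace(Delta @ mmse_matrix(S, Sigma))
print(f"finite difference {fd:.8f}  vs  (1/2) tr(Delta M) {exact:.8f}")

# Monotonicity check: S + Delta >= S in the Loewner order, so
# M(S + Delta) <= M(S); every eigenvalue of the gap should be nonnegative.
gap = mmse_matrix(S, Sigma) - mmse_matrix(S + Delta, Sigma)
print("min eigenvalue of M(S) - M(S + Delta):", np.linalg.eigvalsh(gap).min())
```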
Pages: 1754-1758
Number of pages: 5
Related Papers
50 records in total
  • [1] Gradient of mutual information in linear vector Gaussian channels
    Palomar, DP
    Verdú, S
    2005 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY (ISIT), VOLS 1 AND 2, 2005: 705 - 708
  • [2] Gradient of mutual information in linear vector Gaussian channels
    Palomar, DP
    Verdú, S
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2006, 52 (01) : 141 - 154
  • [3] A Bregman Matrix and the Gradient of Mutual Information for Vector Poisson and Gaussian Channels
    Wang, Liming
    Carlson, David Edwin
    Rodrigues, Miguel R. D.
    Calderbank, Robert
    Carin, Lawrence
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2014, 60 (05) : 2611 - 2629
  • [4] Derivatives of Mutual Information in Gaussian Channels
    Nguyen, Minh-Toan
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2024, 70 (11) : 7525 - 7531
  • [5] Mutual information and MMSE in Gaussian channels
    Guo, DN
    Shamai, S
    Verdú, S
    2004 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY, PROCEEDINGS, 2004, : 347 - 347
  • [6] NOTE ON MUTUAL INFORMATION IN WHITE GAUSSIAN CHANNELS WITH NON-LINEAR FEEDBACK
    UCHIDA, K
    SHIMEMURA, E
    INFORMATION AND CONTROL, 1978, 37 (02): 178 - 181
  • [7] Gradient of Mutual Information in Linear Vector Gaussian Channels in the Presence of Input Noise
    Coutts, Fraser K.
    Thompson, John
    Mulgrew, Bernard
    28TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2020), 2021, : 2264 - 2268
  • [8] Mutual Information Bounds for MIMO Gaussian Channels
    Bai, Dongwoon
    Lee, Jungwon
    2015 IEEE 82ND VEHICULAR TECHNOLOGY CONFERENCE (VTC FALL), 2015
  • [9] Derivative of Mutual Information at Zero SNR: The Gaussian-Noise Case
    Wu, Yihong
    Guo, Dongning
    Verdu, Sergio
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2011, 57 (11) : 7307 - 7312
  • [10] Derivatives of mutual information in Gaussian vector channels with applications
    Feiten, Anke
    Hanly, Stephen
    Mathar, Rudolf
    2007 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY PROCEEDINGS, VOLS 1-7, 2007: 2296+