Memory AMP for Generalized MIMO: Coding Principle and Information-Theoretic Optimality

Cited by: 2
Authors
Chen, Yufei [1 ]
Liu, Lei [2 ]
Chi, Yuhao [1 ]
Li, Ying [1 ]
Zhang, Zhaoyang [2 ]
Affiliations
[1] Xidian Univ, State Key Lab Integrated Serv Networks, Xi'an 710071, Peoples R China
[2] Zhejiang Univ, Coll Informat Sci & Elect Engn, Zhejiang Prov Key Lab Informat Proc Commun & Networking, Hangzhou 310007, Zhejiang, Peoples R China
Keywords
Memory approximate message passing (MAMP); generalized MIMO (GMIMO); low complexity; capacity optimality; coding principle; orthogonal/vector approximate message passing (OAMP/VAMP); MUTUAL INFORMATION; CAPACITY; SYSTEMS;
DOI
10.1109/TWC.2023.3328361
Chinese Library Classification (CLC)
TM [Electrical technology]; TN [Electronic & communication technology];
Discipline Codes
0808; 0809
Abstract
To support complex communication scenarios in next-generation wireless communications, this paper focuses on generalized MIMO (GMIMO) under practical assumptions, such as massive antennas, practical channel coding, arbitrary input distributions, and general right-unitarily-invariant channel matrices (covering Rayleigh fading as well as certain ill-conditioned and correlated channel matrices). The orthogonal/vector approximate message passing (OAMP/VAMP) receiver has been proven information-theoretically optimal in GMIMO, but it relies on high-complexity linear minimum mean-square error (LMMSE) detection. To reduce this complexity, a low-complexity memory approximate message passing (MAMP) receiver has recently been shown to be Bayes optimal, but only for uncoded systems. How to design a low-complexity, information-theoretically optimal receiver for GMIMO therefore remains an open issue. To address this issue, this paper proposes an information-theoretically optimal MAMP receiver and investigates its achievable rate and optimal coding principle. Specifically, because of the long-memory linear detection, the state evolution (SE) of MAMP is multi-dimensional and cannot be used directly to analyze its achievable rate. To avoid this difficulty, a simplified single-input single-output (SISO) variational SE (VSE) for MAMP is developed by leveraging the SE fixed-point consistency of MAMP and OAMP/VAMP. The achievable rate of MAMP is calculated using the VSE, and the optimal coding principle is established to maximize this rate. On this basis, the information-theoretic optimality of MAMP is proved rigorously. Furthermore, the simplified SE analysis via fixed-point consistency is generalized to any two iterative detection algorithms that share the same SE fixed point. Numerical results show that the finite-length performance of MAMP with practically optimized low-density parity-check (LDPC) codes is 0.5-2.7 dB away from the associated constrained capacities. Notably, MAMP achieves the same performance as OAMP/VAMP with only 0.4% of the time consumption for large-scale systems.
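To make the abstract's key analytical step concrete, the following is a minimal sketch, in generic transfer-function notation rather than the paper's exact symbols, of the fixed-point consistency argument behind the variational SE: the multi-dimensional SE of long-memory MAMP and the SISO SE of OAMP/VAMP share the same fixed point, so the achievable rate can be evaluated on the simpler SISO recursion.

\begin{align*}
  \text{OAMP/VAMP (SISO) SE:}\quad & v_{t+1} = \phi_{\mathrm{NLE}}\big(\phi_{\mathrm{LE}}(v_t)\big),\\
  \text{MAMP SE (long memory):}\quad & v_{t+1} = \phi_{\mathrm{NLE}}\big(\Phi_{\mathrm{MLE}}(v_1,\dots,v_t)\big),\\
  \text{shared fixed point:}\quad & v^{\star} = \phi_{\mathrm{NLE}}\big(\phi_{\mathrm{LE}}(v^{\star})\big)
      = \phi_{\mathrm{NLE}}\big(\Phi_{\mathrm{MLE}}(v^{\star},\dots,v^{\star})\big).
\end{align*}

Because both recursions agree at $v^{\star}$, the achievable rate and the coding principle that maximizes it can be derived from the SISO recursion; this simplified recursion is what the abstract calls the variational SE (VSE).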
Pages: 5769-5785 (17 pages)
Related Papers (50 total)
  • [31] An information-theoretic principle implies that any discrete physical theory is classical
    Corsin Pfister
    Stephanie Wehner
    Nature Communications, 4
  • [32] Complexity principle of extremality in evolution of living organisms by information-theoretic entropy
    Szwast, Z
    Sieniutycz, S
    Shiner, JS
    CHAOS SOLITONS & FRACTALS, 2002, 13 (09) : 1871 - 1888
  • [33] An Information-Theoretic Interpretation of the Maximum Entropy Production Principle for Complex Networks
    Abadi, Noam
    Ruzzenenti, Franco
    SSRN, 2023,
  • [34] An information-theoretic principle implies that any discrete physical theory is classical
    Pfister, Corsin
    Wehner, Stephanie
    NATURE COMMUNICATIONS, 2013, 4
  • [35] Information-Theoretic Approach to Estimating the Capacity of Distributed Memory Systems
    B. Ya. Ryabko
    Problems of Information Transmission, 2018, 54 : 191 - 198
  • [36] Information-Theoretic Approach to Estimating the Capacity of Distributed Memory Systems
    Ryabko, B. Ya.
    PROBLEMS OF INFORMATION TRANSMISSION, 2018, 54 (02) : 191 - 198
  • [37] Information-Theoretic Study of Quality Coding in a Realistic ORN Population Model
    Gutierrez-Galvez, Agustin
    Marco, Santiago
    CHEMICAL SENSES, 2009, 34 (03) : E43 - E43
  • [38] Storage Capacity as an Information-Theoretic Vertex Cover and the Index Coding Rate
    Mazumdar, Arya
    McGregor, Andrew
    Vorotnikova, Sofya
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2019, 65 (09) : 5580 - 5591
  • [39] Information-theoretic analysis of signal processing systems: Application to neural coding
    Johnson, DH
    Gruner, CM
    1998 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY - PROCEEDINGS, 1998, : 22 - 22
  • [40] Information-Theoretic Evidence for Predictive Coding in the Face-Processing System
    Brodski-Guerniero, Alla
    Paasch, Georg-Friedrich
    Wollstadt, Patricia
    Ozdemir, Ipeko
    Lizier, Joseph T.
    Wibral, Michael
    JOURNAL OF NEUROSCIENCE, 2017, 37 (34): 8273 - 8283