Memory AMP for Generalized MIMO: Coding Principle and Information-Theoretic Optimality

Cited by: 2
Authors
Chen, Yufei [1 ]
Liu, Lei [2 ]
Chi, Yuhao [1 ]
Li, Ying [1 ]
Zhang, Zhaoyang [2 ]
Affiliations
[1] Xidian Univ, State Key Lab Integrated Serv Networks, Xi'an 710071, Peoples R China
[2] Zhejiang Univ, Coll Informat Sci & Elect Engn, Zhejiang Prov Key Lab Informat Proc Commun & Netwo, Hangzhou 310007, Zhejiang, Peoples R China
Keywords
Memory approximate message passing (MAMP); generalized MIMO (GMIMO); low complexity; capacity optimality; coding principle; orthogonal/vector approximate message passing (OAMP/VAMP); MUTUAL INFORMATION; CAPACITY; SYSTEMS;
DOI
10.1109/TWC.2023.3328361
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
To support complex communication scenarios in next-generation wireless communications, this paper focuses on generalized MIMO (GMIMO) under practical assumptions, such as massive antennas, practical channel coding, arbitrary input distributions, and general right-unitarily-invariant channel matrices (covering Rayleigh fading as well as certain ill-conditioned and correlated channel matrices). The orthogonal/vector approximate message passing (OAMP/VAMP) receiver has been proven information-theoretically optimal for GMIMO, but it relies on high-complexity linear minimum mean-square error (LMMSE) detection. To reduce this complexity, a low-complexity memory approximate message passing (MAMP) receiver has recently been shown to be Bayes optimal, but only for uncoded systems. How to design a low-complexity, information-theoretically optimal receiver for GMIMO therefore remains an open issue. To address it, this paper proposes an information-theoretically optimal MAMP receiver and investigates its achievable rate and optimal coding principle. Specifically, because of its long-memory linear detection, the state evolution (SE) of MAMP is intricately multi-dimensional and cannot be used directly to analyze its achievable rate. To circumvent this difficulty, a simplified single-input single-output (SISO) variational SE (VSE) for MAMP is developed by leveraging the fixed-point consistency of the SEs of MAMP and OAMP/VAMP. The achievable rate of MAMP is calculated using the VSE, and the optimal coding principle is established to maximize this rate. On this basis, the information-theoretic optimality of MAMP is proved rigorously. Furthermore, the simplified SE analysis via fixed-point consistency is generalized to any two iterative detection algorithms with identical SE fixed points.
Numerical results show that the finite-length performance of MAMP with practical optimized low-density parity-check (LDPC) codes is 0.5 to 2.7 dB away from the associated constrained capacities. Notably, MAMP achieves the same performance as OAMP/VAMP with only 0.4% of the time consumption for large-scale systems.
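The fixed-point consistency argument summarized in the abstract can be illustrated with a toy scalar recursion. This is a hypothetical sketch, not the paper's actual SE maps: `se_map_a` stands in for an OAMP/VAMP-style one-step update, and `se_map_b` for a damped, memory-style variant with the same fixed point. Since both maps share the fixed point, they converge to the same asymptotic MSE, which is the property that justifies replacing the multi-dimensional SE with a simplified single-input single-output analysis.

```python
# Toy illustration of SE fixed-point consistency (hypothetical maps, not
# the paper's actual state evolution).

def se_map_a(v):
    # Hypothetical one-step SE map (OAMP/VAMP-style update): a contraction
    # on v > 0 with fixed point v* = sqrt(5) - 2.
    return 1.0 / (1.0 + 4.0 / (1.0 + v))

def se_map_b(v):
    # Damped variant (memory/damping in the MAMP spirit); damping changes
    # the trajectory but preserves the fixed point of se_map_a.
    return 0.5 * v + 0.5 * se_map_a(v)

def run_se(se_map, v0=1.0, iters=200):
    # Iterate the scalar SE recursion v_{t+1} = f(v_t) to convergence.
    v = v0
    for _ in range(iters):
        v = se_map(v)
    return v

va = run_se(se_map_a)
vb = run_se(se_map_b)
assert abs(va - vb) < 1e-9  # identical fixed points -> same asymptotic MSE
```

The two trajectories differ at every finite iteration, yet the limits coincide; in the same spirit, the achievable-rate analysis only needs the common fixed point, not the full multi-dimensional MAMP trajectory.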
Pages: 5769-5785 (17 pages)