Lossy Compression via Sparse Regression Codes: An Approximate Message Passing Approach

Cited by: 0
Authors
Wu, Huihui [1 ]
Wang, Wenjie [1 ]
Liang, Shansuo [1 ]
Han, Wei [1 ]
Bai, Bo [1 ]
Affiliations
[1] Huawei Tech Investment Co. Ltd., Theory Lab, 2012 Labs, Hong Kong, People's Republic of China
Keywords
Lossy compression; sparse regression codes; approximate message passing; Gaussian rate-distortion; quantization; performance
DOI
10.1109/ITW55543.2023.10161626
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
This paper presents a low-complexity lossy compression scheme for Gaussian vectors, using sparse regression codes (SRC) and a novel decimated approximate message passing (AMP) encoder. The sparse regression codebook is characterized by a design matrix, and each codeword is a linear combination of selected columns of that matrix. To make AMP converge in the lossy compression setting, we incorporate the concept of decimation into the AMP algorithm for the first time. Further, we show that a suitable power allocation improves the rate-distortion performance. With a sub-Fourier design matrix, the computational complexity of the proposed encoding is O(log n) per source sample for a length-n source vector. Moreover, the proposed AMP encoder inherently supports successively refinable compression. Simulation results show that the proposed decimated AMP encoder significantly outperforms the existing successive-approximation encoding [1] and approaches the rate-distortion limit in the low-rate regime.
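To make the codebook structure concrete, here is a minimal Python sketch of an SRC codebook with an exponentially decaying power allocation, paired with a greedy successive-approximation-style encoder in the spirit of the baseline [1] that the abstract compares against. It assumes the standard SPARC construction from the literature rather than the paper's exact setup: the design matrix is i.i.d. Gaussian instead of sub-Fourier, the section parameters M and L, the power-allocation decay factor, and all function names are illustrative choices, and the greedy column selection only stands in for the decimated AMP encoder, which instead maintains soft estimates of the coefficient vector and progressively hard-fixes ("decimates") its most reliable sections.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy SRC/SPARC dimensions (illustrative choices, not the paper's settings).
n = 64                      # source block length
L = 8                       # number of sections in the codebook
M = 16                      # columns per section
R = L * np.log2(M) / n      # rate in bits per source sample

# Design matrix A with (approximately) unit-norm columns; i.i.d. Gaussian
# here, whereas the paper uses a sub-Fourier matrix for fast encoding.
# Each codeword is A @ beta, where beta has exactly one nonzero entry per
# section of M columns -- i.e., a linear combination of selected columns.
A = rng.standard_normal((n, M * L)) / np.sqrt(n)

# Exponentially decaying power allocation across sections, normalized so
# the per-sample codeword power is P_total; the decay factor is a common
# SPARC choice, not necessarily the paper's.
P_total = 1.0
kappa = 2.0 ** (-2.0 * R / L)
P = P_total * (1 - kappa) / (1 - kappa ** L) * kappa ** np.arange(L)

def codeword(indices):
    """Build the codeword A @ beta from one chosen column index per section."""
    beta = np.zeros(M * L)
    for l, j in enumerate(indices):
        beta[l * M + j] = np.sqrt(n * P[l])   # nonzero magnitude set by P_l
    return A @ beta

def greedy_encode(s):
    """Successive-approximation-style encoder: per section, pick the column
    most correlated with the running residual. This mimics the baseline
    encoder, not the paper's decimated AMP encoder."""
    residual = s.astype(float).copy()
    indices = []
    for l in range(L):
        cols = A[:, l * M:(l + 1) * M]
        j = int(np.argmax(cols.T @ residual))
        indices.append(j)
        residual = residual - np.sqrt(n * P[l]) * cols[:, j]
    return indices

s = rng.standard_normal(n)            # Gaussian source vector
x_hat = codeword(greedy_encode(s))
print(f"rate = {R:.3f} bits/sample, distortion = {np.mean((s - x_hat) ** 2):.4f}")
```

Replacing the Gaussian design matrix with a sub-Fourier (structured transform) matrix lets the products A @ beta and A.T @ residual run via fast transforms, which is the source of the O(log n) per-sample encoding complexity claimed in the abstract; at toy dimensions like these, the measured distortion stays well above the rate-distortion limit that the paper approaches at large block lengths.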
Pages: 288-293 (6 pages)
Related papers
50 in total (first 10 shown)
  • [1] Rush, Cynthia; Greig, Adam; Venkataramanan, Ramji. Capacity-achieving Sparse Regression Codes via Approximate Message Passing Decoding. 2015 IEEE International Symposium on Information Theory (ISIT), 2015, pp. 2016-2020.
  • [2] Byrne, Evan; Schniter, Philip. Sparse Multinomial Logistic Regression via Approximate Message Passing. IEEE Transactions on Signal Processing, vol. 64, no. 21, 2016, pp. 5485-5498.
  • [3] Tan, Nelvin; Venkataramanan, Ramji. Mixed Regression via Approximate Message Passing. Journal of Machine Learning Research, vol. 24, 2023.
  • [4] Rush, Cynthia; Greig, Adam; Venkataramanan, Ramji. Capacity-Achieving Sparse Superposition Codes via Approximate Message Passing Decoding. IEEE Transactions on Information Theory, vol. 63, no. 3, 2017, pp. 1476-1500.
  • [5] Han, Puxiao; Zhu, Junan; Niu, Ruixin; Baron, Dror. Multi-Processor Approximate Message Passing Using Lossy Compression. 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2016, pp. 6240-6244.
  • [6] Rush, Cynthia; Venkataramanan, Ramji. The Error Probability of Sparse Superposition Codes With Approximate Message Passing Decoding. IEEE Transactions on Information Theory, vol. 65, no. 5, 2019, pp. 3278-3303.
  • [7] Barbier, Jean; Krzakala, Florent. Approximate Message-Passing Decoder and Capacity Achieving Sparse Superposition Codes. IEEE Transactions on Information Theory, vol. 63, no. 8, 2017, pp. 4894-4927.
  • [8] Biyik, Erdem; Barbier, Jean; Dia, Mohamad. Generalized Approximate Message-Passing Decoder for Universal Sparse Superposition Codes. 2017 IEEE International Symposium on Information Theory (ISIT), 2017, pp. 1593-1597.
  • [9] Manoel, Andre; Krzakala, Florent; Tramel, Eric W.; Zdeborova, Lenka. Swept Approximate Message Passing for Sparse Estimation. International Conference on Machine Learning (ICML), vol. 37, 2015, pp. 1123-1132.
  • [10] Chen, Kan; Bu, Zhiqi; Xu, Shiyun. Asymptotic Statistical Analysis of Sparse Group LASSO via Approximate Message Passing. Machine Learning and Knowledge Discovery in Databases, ECML PKDD 2021: Research Track, Part III, vol. 12977, 2021, pp. 510-526.