A Non-Asymptotic Analysis of Generalized Vector Approximate Message Passing Algorithms With Rotationally Invariant Designs

Cited by: 0
Authors
Cademartori, Collin [1 ]
Rush, Cynthia [1 ]
Affiliations
[1] Columbia Univ, Dept Stat, New York, NY 10027 USA
Keywords
Vectors; Estimation; Message passing; Particle measurements; Linear regression; Data models; Atmospheric measurements; State evolution; Dynamics
DOI
10.1109/TIT.2024.3396472
CLC Number
TP [Automation technology; computer technology];
Discipline Code
0812 ;
Abstract
Approximate message passing (AMP) algorithms are a class of iterative procedures for computationally efficient estimation in high-dimensional inference tasks. Due to the presence of an 'Onsager' correction term in its iterates, for N x M design matrices A with i.i.d. Gaussian entries, the asymptotic distribution of the estimate at any iteration of the algorithm can be exactly characterized in the large system limit, as M/N → δ ∈ (0, ∞), via a scalar recursion referred to as state evolution. In this paper, we show that appropriate functionals of the iterates in fact concentrate around the limiting values predicted by these asymptotic distributions, with rates exponentially fast in N, for a large class of AMP-style algorithms. This class includes algorithms used when a high-dimensional generalized linear regression model is assumed to be the data-generating process, like the generalized AMP algorithm, as well as algorithms used when the measurement matrix is assumed to be rotationally invariant rather than i.i.d. Gaussian, like vector AMP and generalized vector AMP. In practice, these more general AMP algorithms have many applications, for example in communications or imaging, and this work provides the first study of the finite sample behavior of such algorithms.
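To make the abstract concrete, the basic AMP iteration it refers to can be sketched for the simplest setting: standard linear regression y = Ax + noise with an i.i.d. Gaussian design and a soft-thresholding denoiser. This is an illustrative sketch only, not the generalized or vector variants the paper analyzes; the adaptive threshold alpha * tau_t (with tau_t estimated from the residual norm, as state evolution suggests) and the value alpha = 1.5 are conventional practical choices, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, theta):
    """Soft-thresholding denoiser eta(v; theta)."""
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp(y, A, alpha=1.5, n_iters=30):
    """Basic AMP for y = A x + noise with i.i.d. Gaussian A (illustrative sketch).

    Each iterate forms the 'effective observation' x_t + A^T z_t, which state
    evolution predicts behaves like the signal plus Gaussian noise of std tau_t,
    denoises it, and then updates the residual with the Onsager correction term.
    """
    n, p = A.shape
    delta = n / p                      # sampling ratio (rows / columns)
    x = np.zeros(p)
    z = y.copy()
    for _ in range(n_iters):
        tau = np.linalg.norm(z) / np.sqrt(n)    # empirical estimate of the SE noise level tau_t
        pseudo = x + A.T @ z                    # effective observation: approx. x + tau * N(0, I)
        x = soft_threshold(pseudo, alpha * tau)
        # Onsager correction: (1/delta) * z_{t-1} * average derivative of the denoiser
        onsager = (z / delta) * np.mean(np.abs(pseudo) > alpha * tau)
        z = y - A @ x + onsager
    return x
```

Without the `onsager` term this reduces to iterative soft thresholding, whose per-iterate errors are not Gaussian; the correction is what makes the scalar state-evolution prediction (and the concentration results described above) possible.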
Pages: 5811-5856
Page count: 46
Related Papers
(24 total)
  • [1] APPROXIMATE MESSAGE PASSING ALGORITHMS FOR ROTATIONALLY INVARIANT MATRICES
    Fan, Zhou
    [J]. ANNALS OF STATISTICS, 2022, 50 (01): : 197 - 224
  • [2] Estimation in Rotationally Invariant Generalized Linear Models via Approximate Message Passing
    Venkataramanan, Ramji
    Koegler, Kevin
    Mondelli, Marco
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [3] PCA Initialization for Approximate Message Passing in Rotationally Invariant Models
    Mondelli, Marco
    Venkataramanan, Ramji
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [4] Vector Approximate Message Passing for the Generalized Linear Model
    Schniter, Philip
    Rangan, Sundeep
    Fletcher, Alyson K.
    [J]. 2016 50TH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS AND COMPUTERS, 2016, : 1525 - 1529
  • [5] Bilinear Adaptive Generalized Vector Approximate Message Passing
    Meng, Xiangming
    Zhu, Jiang
    [J]. IEEE ACCESS, 2019, 7 : 4807 - 4815
  • [6] Approximate Message Passing for Multi-Layer Estimation in Rotationally Invariant Models
    Xu, Yizhou
    Hou, TianQi
    Liang, ShanSuo
    Mondelli, Marco
    [J]. 2023 IEEE INFORMATION THEORY WORKSHOP, ITW, 2023, : 294 - 298
  • [7] Finite Sample Analysis of Approximate Message Passing Algorithms
    Rush, Cynthia
    Venkataramanan, Ramji
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 2018, 64 (11) : 7264 - 7286
  • [8] Analysis of random sequential message passing algorithms for approximate inference
    Cakmak, Burak
    Lu, Yue M.
    Opper, Manfred
    [J]. JOURNAL OF STATISTICAL MECHANICS-THEORY AND EXPERIMENT, 2022, 2022 (07)
  • [9] GENERALIZED APPROXIMATE MESSAGE PASSING FOR COSPARSE ANALYSIS COMPRESSIVE SENSING
    Borgerding, Mark
    Schniter, Philip
    Vila, Jeremy
    Rangan, Sundeep
    [J]. 2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP), 2015, : 3756 - 3760
  • [10] An Expectation-Maximization Approach to Tuning Generalized Vector Approximate Message Passing
    Metzler, Christopher A.
    Schniter, Philip
    Baraniuk, Richard G.
    [J]. LATENT VARIABLE ANALYSIS AND SIGNAL SEPARATION (LVA/ICA 2018), 2018, 10891 : 395 - 406