Approximate Message Passing for Multi-Layer Estimation in Rotationally Invariant Models

Cited by: 1
Authors
Xu, Yizhou [1 ]
Hou, TianQi [2 ]
Liang, ShanSuo [2 ]
Mondelli, Marco [3 ]
Institutions
[1] Tsinghua Univ, Sch Aerosp Engn, Beijing, Peoples R China
[2] Huawei Technol Co Ltd, Cent Res Inst, 2012 Labs, Theory Lab, Shenzhen, Peoples R China
[3] Inst Sci & Technol Austria ISTA, Klosterneuburg, Austria
DOI
10.1109/ITW55543.2023.10160238
CLC classification
TP [Automation and Computer Technology]
Subject classification
0812 ;
Abstract
We consider the problem of reconstructing the signal and the hidden variables from observations coming from a multi-layer network with rotationally invariant weight matrices. The multi-layer structure models inference from deep generative priors, and the rotational invariance imposed on the weights generalizes the i.i.d. Gaussian assumption by allowing for a complex correlation structure, which is typical in applications. In this work, we present a new class of approximate message passing (AMP) algorithms and give a state evolution recursion which precisely characterizes their performance in the large system limit. In contrast with the existing multi-layer VAMP (ML-VAMP) approach, our proposed AMP - dubbed multi-layer rotationally invariant generalized AMP (ML-RI-GAMP) - provides a natural generalization beyond Gaussian designs, in the sense that it recovers the existing Gaussian AMP as a special case. Furthermore, ML-RI-GAMP exhibits a significantly lower complexity than ML-VAMP, as the computationally intensive singular value decomposition is replaced by an estimation of the moments of the design matrices. Finally, our numerical results show that this complexity gain comes at little to no cost in the performance of the algorithm.
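The abstract notes that ML-RI-GAMP avoids the SVD used by ML-VAMP and instead works with moment estimates of the design matrices. As a minimal sketch of the underlying idea (not the paper's algorithm; the helper names are illustrative), a rotationally invariant design A = U diag(s) Vᵀ with Haar-distributed orthogonal factors can be sampled via QR decompositions, and the moments of the spectrum of AᵀA recovered from normalized traces of its powers, with no SVD:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

def haar_orthogonal(n, rng):
    """Sample a Haar-distributed orthogonal matrix.

    QR of a Gaussian matrix gives a Haar orthogonal factor once the
    column signs are fixed by the diagonal of R.
    """
    Q, R = np.linalg.qr(rng.standard_normal((n, n)))
    return Q * np.sign(np.diag(R))

# Rotationally invariant design: A = U diag(s) V^T, Haar U and V,
# arbitrary (non-Gaussian) singular-value profile s.
U, V = haar_orthogonal(n, rng), haar_orthogonal(n, rng)
s = rng.uniform(0.5, 1.5, size=n)
A = U @ np.diag(s) @ V.T

def spectral_moments(A, k_max):
    """Moments of the spectral distribution of A^T A via normalized
    traces: tr((A^T A)^k) / n = (1/n) * sum_i s_i^{2k}. No SVD needed."""
    G = A.T @ A
    Gk, moments = np.eye(A.shape[1]), []
    for _ in range(k_max):
        Gk = Gk @ G
        moments.append(np.trace(Gk) / A.shape[1])
    return moments

est = spectral_moments(A, 3)
exact = [np.mean(s ** (2 * k)) for k in (1, 2, 3)]
```

Since tr((AᵀA)ᵏ) equals the sum of the squared singular values raised to the k-th power, `est` matches `exact` up to floating-point error; in practice only A is observed and such trace estimates stand in for the unknown spectrum.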
Pages: 294-298
Page count: 5