APPROXIMATE MESSAGE PASSING ALGORITHMS FOR ROTATIONALLY INVARIANT MATRICES

Cited: 23
Authors
Fan, Zhou [1]
Affiliations
[1] Yale Univ, Dept Stat & Data Sci, New Haven, CT 06520 USA
Source
ANNALS OF STATISTICS | 2022, Vol. 50, No. 1
Keywords
AMP; free probability theory; high-dimensional asymptotics; PCA; rectangular random matrices; state evolution; LASSO; information
DOI
10.1214/21-AOS2101
CLC Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Codes
020208; 070103; 0714
Abstract
Approximate Message Passing (AMP) algorithms have seen widespread use across a variety of applications. However, the precise forms for their Onsager corrections and state evolutions depend on properties of the underlying random matrix ensemble, limiting the extent to which AMP algorithms derived for white noise may be applicable to data matrices that arise in practice. In this work, we study more general AMP algorithms for random matrices W that satisfy orthogonal rotational invariance in law, where W may have a spectral distribution that is different from the semicircle and Marcenko-Pastur laws characteristic of white noise. The Onsager corrections and state evolutions in these algorithms are defined by the free cumulants or rectangular free cumulants of the spectral distribution of W. Their forms were derived previously by Opper, Cakmak and Winther using nonrigorous dynamic functional theory techniques, and we provide rigorous proofs. Our motivating application is a Bayes-AMP algorithm for Principal Components Analysis, when there is prior structure for the principal components (PCs) and possibly nonwhite noise. For sufficiently large signal strengths and any non-Gaussian prior distributions for the PCs, we show that this algorithm provably achieves higher estimation accuracy than the sample PCs.
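The free cumulants that parametrize the Onsager corrections above can be computed directly from the moments of the spectral distribution of W via the standard moment-cumulant recursion of free probability, m_n = sum_{s=1}^{n} kappa_s * sum_{i_1+...+i_s = n-s} m_{i_1}...m_{i_s} (with m_0 = 1). Below is a minimal Python sketch of that recursion; it is not the paper's implementation, and the function name free_cumulants and the GOE sanity check are illustrative assumptions. For white noise (a GOE-like W with semicircle spectral law), the only nonzero free cumulant is kappa_2 = 1, which is why the general Onsager correction collapses to the classical one in that case.

import numpy as np

def free_cumulants(moments):
    # moments[k] = m_{k+1}: the (k+1)-th moment of the spectral distribution.
    # Returns kappa_1, ..., kappa_K via the free moment-cumulant recursion
    #   m_n = sum_{s=1}^{n} kappa_s * sum_{i_1+...+i_s = n-s} m_{i_1}...m_{i_s}.
    K = len(moments)
    m = np.concatenate(([1.0], np.asarray(moments, dtype=float)))  # m[0] = 1
    # conv[s, j] = sum over compositions i_1+...+i_s = j of m_{i_1}...m_{i_s}
    conv = np.zeros((K + 1, K + 1))
    conv[0, 0] = 1.0
    for s in range(1, K + 1):
        for j in range(K + 1):
            conv[s, j] = sum(conv[s - 1, j - i] * m[i] for i in range(j + 1))
    kappa = np.zeros(K + 1)
    for n in range(1, K + 1):
        # Solve for kappa_n: the s = n term contributes kappa_n * conv[n, 0] = kappa_n.
        kappa[n] = m[n] - sum(kappa[s] * conv[s, n - s] for s in range(1, n))
    return kappa[1:]

# Sanity check on a GOE-like matrix: the limiting spectral law is semicircle,
# so kappa_2 should be close to 1 and all other free cumulants close to 0.
rng = np.random.default_rng(0)
N = 2000
G = rng.normal(size=(N, N)) / np.sqrt(N)
W = (G + G.T) / np.sqrt(2.0)
eigs = np.linalg.eigvalsh(W)
moments = [np.mean(eigs ** k) for k in range(1, 7)]
print(np.round(free_cumulants(moments), 2))  # expect approx [0, 1, 0, 0, 0, 0]

For rectangular W, the analogous role is played by the rectangular free cumulants, which satisfy a similar recursion depending on the aspect ratio of W.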
Pages: 197-224
Page count: 28
Related Papers
50 items in total
  • [1] PCA Initialization for Approximate Message Passing in Rotationally Invariant Models
    Mondelli, Marco; Venkataramanan, Ramji
    Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021, 34.
  • [2] A Non-Asymptotic Analysis of Generalized Vector Approximate Message Passing Algorithms With Rotationally Invariant Designs
    Cademartori, Collin; Rush, Cynthia
    IEEE Transactions on Information Theory, 2024, 70(8): 5811-5856.
  • [3] Approximate Message Passing for Multi-Layer Estimation in Rotationally Invariant Models
    Xu, Yizhou; Hou, TianQi; Liang, ShanSuo; Mondelli, Marco
    2023 IEEE Information Theory Workshop (ITW), 2023: 294-298.
  • [4] Estimation in Rotationally Invariant Generalized Linear Models via Approximate Message Passing
    Venkataramanan, Ramji; Koegler, Kevin; Mondelli, Marco
    International Conference on Machine Learning (ICML), Vol. 162, 2022.
  • [5] Universality of Approximate Message Passing Algorithms
    Chen, Wei-Kuo; Lam, Wai-Kit
    Electronic Journal of Probability, 2021, 26.
  • [6] On the Convergence of Approximate Message Passing With Arbitrary Matrices
    Rangan, Sundeep; Schniter, Philip; Fletcher, Alyson K.; Sarkar, Subrata
    IEEE Transactions on Information Theory, 2019, 65(9): 5339-5351.
  • [7] On the Convergence of Approximate Message Passing with Arbitrary Matrices
    Rangan, Sundeep; Schniter, Philip; Fletcher, Alyson
    2014 IEEE International Symposium on Information Theory (ISIT), 2014: 236-240.
  • [8] Dynamics of Damped Approximate Message Passing Algorithms
    Mimura, Kazushi; Takeuchi, Jun'ichi
    2019 IEEE Information Theory Workshop (ITW), 2019: 564-568.
  • [9] Finite Sample Analysis of Approximate Message Passing Algorithms
    Rush, Cynthia; Venkataramanan, Ramji
    IEEE Transactions on Information Theory, 2018, 64(11): 7264-7286.
  • [10] Universality of Approximate Message Passing Algorithms and Tensor Networks
    Wang, Tianhao; Zhong, Xinyi; Fan, Zhou
    Annals of Applied Probability, 2024, 34(4): 3943-3994.