Vector Approximate Message Passing for the Generalized Linear Model

Cited by: 0
Authors
Schniter, Philip [1 ]
Rangan, Sundeep [2 ]
Fletcher, Alyson K. [3 ,4 ,5 ]
Institutions
[1] Ohio State Univ, Dept ECE, Columbus, OH 43210 USA
[2] NYU, Dept Elect & Comp Engn, Brooklyn, NY 11201 USA
[3] Univ Calif Los Angeles, Dept Stat, Los Angeles, CA 90095 USA
[4] Univ Calif Los Angeles, Dept Math, Los Angeles, CA 90095 USA
[5] Univ Calif Los Angeles, Dept Elect Engn, Los Angeles, CA 90095 USA
Funding
US National Science Foundation
Keywords
ALGORITHMS;
DOI
None available
CLC number
TP [Automation technology, computer technology]
Subject classification number
0812
Abstract
The generalized linear model (GLM), where a random vector x is observed through a noisy, possibly nonlinear, function of a linear transform output z = Ax, arises in a range of applications such as robust regression, binary classification, quantized compressed sensing, phase retrieval, photon-limited imaging, and inference from neural spike trains. When A is large and i.i.d. Gaussian, the generalized approximate message passing (GAMP) algorithm is an efficient means of MAP or marginal inference, and its performance can be rigorously characterized by a scalar state evolution. For general A, though, GAMP can misbehave. Damping and sequential-updating help to robustify GAMP, but their effects are limited. Recently, a "vector AMP" (VAMP) algorithm was proposed for additive white Gaussian noise channels. VAMP extends AMP's guarantees from i.i.d. Gaussian A to the larger class of rotationally invariant A. In this paper, we show how VAMP can be extended to the GLM. Numerical experiments show that the proposed GLM-VAMP is much more robust to ill-conditioning in A than damped GAMP.
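The GLM observation model described in the abstract can be illustrated with a short sketch. This is an illustrative toy setup, not code from the paper: it draws a sparse signal x and an i.i.d. Gaussian matrix A, forms the linear transform output z = Ax, and passes z through a noisy nonlinear channel (here, 1-bit quantization, as in binary classification or quantized compressed sensing). All variable names and parameter choices are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 200, 100  # measurements, signal dimension (illustrative sizes)

# Sparse signal x: Bernoulli-Gaussian, roughly 10% of entries nonzero
x = rng.standard_normal(n) * (rng.random(n) < 0.1)

# i.i.d. Gaussian measurement matrix A and linear transform output z = A x
A = rng.standard_normal((m, n)) / np.sqrt(m)
z = A @ x

# GLM observation: a noisy, nonlinear function of z.
# Here a 1-bit (sign) channel with additive Gaussian noise before quantization.
w = 0.1 * rng.standard_normal(m)
y = np.sign(z + w)
```

Other channels mentioned in the abstract fit the same template by swapping the last step: an identity channel with heavy-tailed noise gives robust regression, a modulus nonlinearity `y = np.abs(z + w)` gives phase retrieval, and a Poisson channel models photon-limited imaging.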
Pages: 1525-1529
Page count: 5
Related papers (50 total; 10 shown)
  • [1] Tian, Feiyan; Liu, Lei; Chen, Xiaoming. Generalized Memory Approximate Message Passing for Generalized Linear Model. IEEE Transactions on Signal Processing, 2022, 70: 6404-6418.
  • [2] Mo, Linlin; Lu, Xinhua; Yuan, Jide; Zhang, Chuanzong; Wang, Zhongyong; Popovski, Petar. Generalized Unitary Approximate Message Passing for Double Linear Transformation Model. IEEE Transactions on Signal Processing, 2023, 71: 1524-1538.
  • [3] Meng, Xiangming; Zhu, Jiang. Bilinear Adaptive Generalized Vector Approximate Message Passing. IEEE Access, 2019, 7: 4807-4815.
  • [4] Rangan, Sundeep; Schniter, Philip; Fletcher, Alyson K. Vector Approximate Message Passing. IEEE Transactions on Information Theory, 2019, 65(10): 6664-6684.
  • [5] Rangan, Sundeep; Schniter, Philip; Fletcher, Alyson K. Vector Approximate Message Passing. 2017 IEEE International Symposium on Information Theory (ISIT), 2017: 1588-1592.
  • [6] Mondelli, Marco; Venkataramanan, Ramji. Approximate Message Passing with Spectral Initialization for Generalized Linear Models. Journal of Statistical Mechanics: Theory and Experiment, 2022, 2022(11).
  • [7] Rangan, Sundeep. Generalized Approximate Message Passing for Estimation with Random Linear Mixing. 2011 IEEE International Symposium on Information Theory Proceedings (ISIT), 2011.
  • [8] Mondelli, Marco; Venkataramanan, Ramji. Approximate Message Passing with Spectral Initialization for Generalized Linear Models. 24th International Conference on Artificial Intelligence and Statistics (AISTATS), 2021, 130: 397+.
  • [9] Skuratovs, Nikolajs; Davies, Michael. Upscaling Vector Approximate Message Passing. 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2020: 4757-4761.
  • [10] Metzler, Christopher A.; Schniter, Philip; Baraniuk, Richard G. An Expectation-Maximization Approach to Tuning Generalized Vector Approximate Message Passing. Latent Variable Analysis and Signal Separation (LVA/ICA 2018), 2018, 10891: 395-406.