Asymptotic Statistical Analysis of Sparse Group LASSO via Approximate Message Passing

Cited by: 0
Authors
Chen, Kan [1 ]
Bu, Zhiqi [1 ]
Xu, Shiyun [1 ]
Affiliation
[1] Univ Penn, Grad Grp Appl Math & Computat Sci, Philadelphia, PA 19104 USA
Keywords
THRESHOLDING ALGORITHM; SHRINKAGE; SELECTION;
DOI
10.1007/978-3-030-86523-8_31
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Sparse Group LASSO (SGL) is a regularized model for high-dimensional linear regression problems with grouped covariates. SGL applies ℓ1 and ℓ2 penalties to the individual predictors and the group predictors, respectively, to enforce sparsity at both the between-group and within-group levels. In this paper, we apply the approximate message passing (AMP) algorithm to efficiently solve the SGL problem under Gaussian random designs. We further use the recently developed state evolution analysis of AMP to derive an asymptotically exact characterization of the SGL solution. This allows us to conduct multiple fine-grained statistical analyses of SGL, through which we investigate the effects of the group information and of γ (the proportion of the ℓ1 penalty). Through the lens of various performance measures, we show that SGL with small γ benefits significantly from the group information and can outperform other SGL variants (including the LASSO) and regularized models that do not exploit the group information, in terms of signal recovery rate, false discovery rate, and mean squared error.
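The abstract describes the SGL penalty as a mixture of an ℓ1 term on individual coefficients and an ℓ2 term on coefficient groups, with γ controlling the ℓ1 proportion. The Python sketch below is not the authors' AMP implementation; it is a minimal illustration, under one common parameterization of the SGL objective, of the penalty, its proximal operator (coordinate-wise soft-thresholding followed by groupwise shrinkage), and a plain proximal-gradient loop as a baseline solver. The function names, the choices of λ, γ, and the group structure are illustrative assumptions, and the Onsager correction that distinguishes AMP from this loop is omitted.

```python
import numpy as np

def sgl_objective(beta, X, y, groups, lam, gamma):
    """One common SGL parameterization (illustrative; not necessarily the
    paper's exact scaling):
        (1/2n)||y - X beta||^2
        + lam * gamma * ||beta||_1
        + lam * (1 - gamma) * sum_g ||beta_g||_2
    `groups` is a list of index arrays, one per group."""
    n = len(y)
    resid = y - X @ beta
    l1 = np.abs(beta).sum()
    group_l2 = sum(np.linalg.norm(beta[g]) for g in groups)
    return 0.5 / n * resid @ resid + lam * (gamma * l1 + (1 - gamma) * group_l2)

def sgl_prox(v, groups, t, lam, gamma):
    """Proximal operator of t*lam*(gamma*||.||_1 + (1-gamma)*sum_g ||.||_2):
    soft-threshold every coordinate, then shrink each group's norm."""
    out = np.sign(v) * np.maximum(np.abs(v) - t * lam * gamma, 0.0)
    for g in groups:
        norm_g = np.linalg.norm(out[g])
        if norm_g > 0.0:
            out[g] *= max(0.0, 1.0 - t * lam * (1.0 - gamma) / norm_g)
    return out

# Toy usage: plain proximal gradient (ISTA) on a Gaussian random design.
# AMP would add an Onsager correction to the residual; that term is
# deliberately left out to keep this sketch short.
rng = np.random.default_rng(0)
n, p, group_size = 200, 400, 20
X = rng.standard_normal((n, p)) / np.sqrt(n)     # i.i.d. Gaussian design
beta_true = np.zeros(p)
beta_true[:group_size] = 1.0                     # one active group
y = X @ beta_true + 0.1 * rng.standard_normal(n)
groups = [np.arange(i, i + group_size) for i in range(0, p, group_size)]

lam, gamma = 0.05, 0.5                           # illustrative choices
step = n / np.linalg.norm(X, 2) ** 2             # 1 / Lipschitz constant of the gradient
beta = np.zeros(p)
for _ in range(500):
    grad = -X.T @ (y - X @ beta) / n
    beta = sgl_prox(beta - step * grad, groups, step, lam, gamma)
```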
Pages: 510-526
Page count: 17