The sparsity and bias of the lasso selection in high-dimensional linear regression

Cited: 501
Authors
Zhang, Cun-Hui [1 ]
Huang, Jian [2 ]
Affiliations
[1] Rutgers State Univ, Dept Stat, Hill Ctr, Piscataway, NJ 08854 USA
[2] Univ Iowa, Dept Stat & Actuarial Sci, Iowa City, IA 52242 USA
Source
ANNALS OF STATISTICS | 2008, Vol. 36, No. 4
Keywords
penalized regression; high-dimensional data; variable selection; bias; rate consistency; spectral analysis; random matrices;
DOI
10.1214/07-AOS520
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Classification Codes
020208; 070103; 0714;
Abstract
Meinshausen and Bühlmann [Ann. Statist. 34 (2006) 1436-1462] showed that, for neighborhood selection in Gaussian graphical models, the LASSO is consistent under a neighborhood stability condition, even when the number of variables is of greater order than the sample size. Zhao and Yu [J. Mach. Learn. Res. 7 (2006) 2541-2567] formalized the neighborhood stability condition in the context of linear regression as a strong irrepresentable condition, and showed that under this condition the LASSO selects exactly the set of nonzero regression coefficients, provided that these coefficients are bounded away from zero at a certain rate. In this paper, the regression coefficients outside an ideal model are assumed to be small, but not necessarily zero. Under a sparse Riesz condition on the correlation of design variables, we prove that the LASSO selects a model of the correct order of dimensionality, controls the bias of the selected model at a level determined by the contributions of small regression coefficients and threshold bias, and selects all coefficients of greater order than the bias of the selected model. Moreover, as a consequence of this rate consistency of the LASSO in model selection, it is proved that the sum of squared errors for the mean response and the ℓ_α-loss for the regression coefficients converge at the best possible rates under the given conditions. An interesting aspect of our results is that the logarithm of the number of variables can be of the same order as the sample size for certain random dependent designs.
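The variable-selection behavior discussed in the abstract can be illustrated with a minimal coordinate-descent LASSO on synthetic data. This is only a sketch: the paper proves rate results under a sparse Riesz condition and does not prescribe an algorithm; the function `lasso_cd`, the penalty level, and the toy design below are illustrative assumptions, not the authors' method.

```python
import random

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent LASSO: minimize (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with feature j excluded from the fit
            r = [y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n)) / n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            # soft-thresholding update: coefficients with |rho| <= lam are set
            # to exactly zero, which is what makes the LASSO select variables
            if rho > lam:
                b[j] = (rho - lam) / z
            elif rho < -lam:
                b[j] = (rho + lam) / z
            else:
                b[j] = 0.0
    return b

# toy sparse design: only the first two of p coefficients are truly nonzero
random.seed(0)
n, p = 50, 10
beta_true = [3.0, -2.0] + [0.0] * (p - 2)
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [sum(X[i][j] * beta_true[j] for j in range(p)) + random.gauss(0, 0.1)
     for i in range(n)]

beta_hat = lasso_cd(X, y, lam=0.5)
selected = [j for j in range(p) if abs(beta_hat[j]) > 1e-8]
print("selected model:", selected)
print("estimates:", [round(b, 2) for b in beta_hat[:2]])
```

Note how the recovered model has the correct dimensionality while the nonzero estimates are shrunk toward zero by roughly the penalty level `lam`: this shrinkage is the "threshold bias" that the paper's bounds account for.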
Pages: 1567-1594
Page count: 28
Related Papers
50 items in total
  • [31] An Additive Sparse Penalty for Variable Selection in High-Dimensional Linear Regression Model
    Lee, Sangin
    COMMUNICATIONS FOR STATISTICAL APPLICATIONS AND METHODS, 2015, 22 (02) : 147 - 157
  • [32] BOSO: A novel feature selection algorithm for linear regression with high-dimensional data
    Valcarcel, Luis J.
    San Jose-Eneriz, Edurne L.
    Cendoya, Xabier
    Rubio, Angel L.
    Agirre, Xabier
    Prosper, Felipe L.
    Planes, Francisco
    PLOS COMPUTATIONAL BIOLOGY, 2022, 18 (05)
  • [33] High-dimensional linear discriminant analysis with moderately clipped LASSO
    Chang, Jaeho
    Moon, Haeseong
    Kwon, Sunghoon
    COMMUNICATIONS FOR STATISTICAL APPLICATIONS AND METHODS, 2021, 28 (01) : 21 - 37
  • [34] Moderately clipped LASSO for the high-dimensional generalized linear model
    Lee, Sangin
    Ku, Boncho
    Kwon, Sunghoon
    COMMUNICATIONS FOR STATISTICAL APPLICATIONS AND METHODS, 2020, 27 (04) : 445 - 458
  • [35] Hi-LASSO: High-Dimensional LASSO
    Kim, Youngsoon
    Hao, Jie
    Mallavarapu, Tejaswini
    Park, Joongyang
    Kang, Mingon
    IEEE ACCESS, 2019, 7 : 44562 - 44573
  • [36] Adaptive group Lasso for high-dimensional generalized linear models
    Wang, Mingqiu
    Tian, Guo-Liang
    STATISTICAL PAPERS, 2019, 60 (05) : 1469 - 1486
  • [37] Asymptotic properties of Lasso in high-dimensional partially linear models
    Ma Chi
    Huang Jian
    SCIENCE CHINA-MATHEMATICS, 2016, 59 (04) : 769 - 788
  • [38] Overlapping group lasso for high-dimensional generalized linear models
    Zhou, Shengbin
    Zhou, Jingke
    Zhang, Bo
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2019, 48 (19) : 4903 - 4917