The sparsity and bias of the lasso selection in high-dimensional linear regression

Cited: 501
Authors
Zhang, Cun-Hui [1 ]
Huang, Jian [2 ]
Affiliations
[1] Rutgers State Univ, Dept Stat, Hill Ctr, Piscataway, NJ 08854 USA
[2] Univ Iowa, Dept Stat & Actuarial Sci, Iowa City, IA 52242 USA
Source
ANNALS OF STATISTICS | 2008, Vol. 36, No. 4
Keywords
penalized regression; high-dimensional data; variable selection; bias; rate consistency; spectral analysis; random matrices;
DOI
10.1214/07-AOS520
CLC classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject classification codes
020208 ; 070103 ; 0714 ;
Abstract
Meinshausen and Bühlmann [Ann. Statist. 34 (2006) 1436-1462] showed that, for neighborhood selection in Gaussian graphical models, under a neighborhood stability condition, the LASSO is consistent even when the number of variables is of greater order than the sample size. Zhao and Yu [J. Mach. Learn. Res. 7 (2006) 2541-2567] formalized the neighborhood stability condition in the context of linear regression as a strong irrepresentable condition, and showed that under this condition the LASSO selects exactly the set of nonzero regression coefficients, provided that these coefficients are bounded away from zero at a certain rate. In this paper, the regression coefficients outside an ideal model are assumed to be small, but not necessarily zero. Under a sparse Riesz condition on the correlation of design variables, we prove that the LASSO selects a model of the correct order of dimensionality, controls the bias of the selected model at a level determined by the contributions of small regression coefficients and threshold bias, and selects all coefficients of greater order than the bias of the selected model. Moreover, as a consequence of this rate consistency of the LASSO in model selection, it is proved that the sum of error squares for the mean response and the ℓα-loss for the regression coefficients converge at the best possible rates under the given conditions. An interesting aspect of our results is that the logarithm of the number of variables can be of the same order as the sample size for certain random dependent designs.
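The approximately sparse regime the abstract describes (a few large coefficients plus many small but nonzero ones, with more variables than observations) can be illustrated with a minimal lasso fit. The coordinate-descent solver, dimensions, and penalty level below are illustrative assumptions for a sketch, not the paper's procedure or conditions:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent.
    Minimizes (1/(2n)) * ||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n   # per-column scaling
    r = y - X @ b                        # current residual
    for _ in range(n_iter):
        for j in range(p):
            # partial correlation of coordinate j with the residual
            rho = X[:, j] @ r / n + col_sq[j] * b[j]
            # soft-thresholding update
            bj_new = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
            r += X[:, j] * (b[j] - bj_new)
            b[j] = bj_new
    return b

# Simulated design with p > n: a handful of large coefficients plus
# many small-but-nonzero ones outside the "ideal model".
rng = np.random.default_rng(0)
n, p = 100, 200
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 3.0      # large coefficients the lasso should recover
beta[5:15] = 0.02   # small coefficients it may ignore
y = X @ beta + 0.5 * rng.standard_normal(n)

b_hat = lasso_cd(X, y, lam=0.3)      # lam chosen for illustration only
selected = np.flatnonzero(b_hat)
```

In this sketch the large coefficients survive selection while the estimates carry a shrinkage bias of roughly the penalty level, echoing (informally) the abstract's point that the selected model's dimensionality is of the right order and its bias is driven by small coefficients and thresholding.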
Pages: 1567 - 1594
Page count: 28
Related Papers
50 records
  • [21] ASYMPTOTIC ANALYSIS OF HIGH-DIMENSIONAL LAD REGRESSION WITH LASSO
    Gao, Xiaoli
    Huang, Jian
    STATISTICA SINICA, 2010, 20 (04) : 1485 - 1506
  • [22] Robust adaptive LASSO in high-dimensional logistic regression
    Basu, Ayanendranath
    Ghosh, Abhik
    Jaenada, Maria
    Pardo, Leandro
    STATISTICAL METHODS AND APPLICATIONS, 2024
  • [23] A study on tuning parameter selection for the high-dimensional lasso
    Homrighausen, Darren
    McDonald, Daniel J.
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2018, 88 (15) : 2865 - 2892
  • [24] Variable selection in high-dimensional sparse multiresponse linear regression models
    Luo, Shan
    STATISTICAL PAPERS, 2020, 61 (03) : 1245 - 1267
  • [26] Fully Bayesian logistic regression with hyper-LASSO priors for high-dimensional feature selection
    Li, Longhai
    Yao, Weixin
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2018, 88 (14) : 2827 - 2851
  • [27] An Improved Forward Regression Variable Selection Algorithm for High-Dimensional Linear Regression Models
    Xie, Yanxi
    Li, Yuewen
    Xia, Zhijie
    Yan, Ruixia
    IEEE ACCESS, 2020, 8 (08) : 129032 - 129042
  • [28] The joint lasso: high-dimensional regression for group structured data
    Dondelinger, Frank
    Mukherjee, Sach
    BIOSTATISTICS, 2020, 21 (02) : 219 - 235
  • [29] Linear and nonlinear signal detection and estimation in high-dimensional nonparametric regression under weak sparsity
    Cheung, Kin Yap
    Lee, Stephen M. S.
    Xu, Xiaoya
    BERNOULLI, 2024, 30 (01) : 636 - 665
  • [30] Pathway Lasso: pathway estimation and selection with high-dimensional mediators
    Zhao, Yi
    Luo, Xi
    STATISTICS AND ITS INTERFACE, 2022, 15 (01) : 39 - 50