LASSO-TYPE RECOVERY OF SPARSE REPRESENTATIONS FOR HIGH-DIMENSIONAL DATA

Cited: 430
Authors
Meinshausen, Nicolai [1 ]
Yu, Bin [2 ]
Affiliations
[1] Univ Oxford, Dept Stat, Oxford OX1 3TG, England
[2] Univ Calif Berkeley, Dept Stat, Berkeley, CA 94720 USA
Source
ANNALS OF STATISTICS | 2009, Vol. 37, No. 1
Keywords
Shrinkage estimation; lasso; high-dimensional data; sparsity; MODEL SELECTION; ADAPTIVE LASSO; ASYMPTOTICS; REGRESSION;
DOI
10.1214/07-AOS582
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Codes
020208; 070103; 0714
Abstract
The Lasso is an attractive technique for regularization and variable selection for high-dimensional data, where the number of predictor variables p(n) is potentially much larger than the number of samples n. However, it was recently discovered that the sparsity pattern of the Lasso estimator can only be asymptotically identical to the true sparsity pattern if the design matrix satisfies the so-called irrepresentable condition. The latter condition can easily be violated in the presence of highly correlated variables. Here we examine the behavior of the Lasso estimators if the irrepresentable condition is relaxed. Even though the Lasso cannot recover the correct sparsity pattern, we show that the estimator is still consistent in the l(2)-norm sense for fixed designs under conditions on (a) the number s(n) of nonzero components of the vector beta(n) and (b) the minimal singular values of design matrices that are induced by selecting small subsets of variables. Furthermore, a rate of convergence result is obtained on the l(2) error with an appropriate choice of the smoothing parameter. The rate is shown to be optimal under the condition of bounded maximal and minimal sparse eigenvalues. Our results imply that, with high probability, all important variables are selected. The set of selected variables is a meaningful reduction of the original set of variables. Finally, our results are illustrated with the detection of closely adjacent frequencies, a problem encountered in astrophysics.
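The abstract's central claim, that even when the irrepresentable condition fails and exact support recovery is impossible, the Lasso still selects all important variables (possibly with extras) and is l(2)-consistent, can be illustrated with a small simulation. The sketch below is not the authors' code; `lasso_cd` is a generic coordinate-descent solver, and the penalty level, design, and signal strength are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the closed-form 1-D Lasso update."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=500):
    """Coordinate descent for min_b (1/(2n))||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    for _ in range(n_sweeps):
        for j in range(p):
            # partial residual with coordinate j's contribution removed
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
    return b

rng = np.random.default_rng(0)
n, p, s = 50, 100, 3  # p > n, sparse truth with s nonzero coefficients
# a shared latent factor makes all predictors pairwise correlated,
# the regime in which the irrepresentable condition tends to fail
latent = rng.standard_normal(n)
X = rng.standard_normal((n, p)) + 0.6 * latent[:, None]
beta = np.zeros(p)
beta[:s] = [3.0, -3.0, 3.0]
y = X @ beta + 0.1 * rng.standard_normal(n)

b_hat = lasso_cd(X, y, lam=0.1)
selected = np.flatnonzero(np.abs(b_hat) > 1e-6)
# Expected behavior per the paper: the selected set contains the true
# support (perhaps with spurious extras), and ||b_hat - beta||_2 is small.
```

The selected set is a "meaningful reduction" in the paper's sense: it may overshoot the true support, but it retains every important variable while discarding most of the p variables.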
Pages: 246-270
Page count: 25
Related Papers
50 records
  • [1] LASSO-type variable selection methods for high-dimensional data
    Fu, Guanghui
    Wang, Pan
    [J]. ADVANCES IN COMPUTATIONAL MODELING AND SIMULATION, PTS 1 AND 2, 2014, 444-445 : 604 - 609
  • [2] ET-Lasso: A New Efficient Tuning of Lasso-type Regularization for High-Dimensional Data
    Yang, Songshan
    Wen, Jiawei
    Zhan, Xiang
    Kifer, Daniel
    [J]. KDD'19: PROCEEDINGS OF THE 25TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, 2019, : 607 - 616
  • [3] High-dimensional nonconvex LASSO-type M-estimators
    Beyhum, Jad
    Portier, Francois
    [J]. JOURNAL OF MULTIVARIATE ANALYSIS, 2024, 202
  • [4] Sparse Recovery With Unknown Variance: A LASSO-Type Approach
    Chretien, Stephane
    Darses, Sebastien
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 2014, 60 (07) : 3970 - 3988
  • [5] Comparison of Lasso Type Estimators for High-Dimensional Data
    Kim, Jaehee
    [J]. COMMUNICATIONS FOR STATISTICAL APPLICATIONS AND METHODS, 2014, 21 (04) : 349 - 361
  • [6] ADAPTIVE LASSO FOR SPARSE HIGH-DIMENSIONAL REGRESSION MODELS
    Huang, Jian
    Ma, Shuangge
    Zhang, Cun-Hui
    [J]. STATISTICA SINICA, 2008, 18 (04) : 1603 - 1618
  • [7] The adaptive lasso in high-dimensional sparse heteroscedastic models
    Wagener J.
    Dette H.
    [J]. Mathematical Methods of Statistics, 2013, 22 (2) : 137 - 154
  • [8] On the anonymization of sparse high-dimensional data
    Ghinita, Gabriel
    Tao, Yufei
    Kalnis, Panos
    [J]. 2008 IEEE 24TH INTERNATIONAL CONFERENCE ON DATA ENGINEERING, VOLS 1-3, 2008, : 715 - +
  • [9] Sparse and debiased lasso estimation and inference for high-dimensional composite quantile regression with distributed data
    Zhaohan Hou
    Wei Ma
    Lei Wang
    [J]. TEST, 2023, 32 : 1230 - 1250
  • [10] Interpolation of sparse high-dimensional data
    Thomas C. H. Lux
    Layne T. Watson
    Tyler H. Chang
    Yili Hong
    Kirk Cameron
    [J]. Numerical Algorithms, 2021, 88 : 281 - 313