Asymptotic properties of Lasso plus mLS and Lasso plus Ridge in sparse high-dimensional linear regression

Cited by: 48
Author
Liu, Hanzhong [1 ]
Affiliation
[1] Peking Univ, Sch Math Sci, Beijing 100871, Peoples R China
Source
ELECTRONIC JOURNAL OF STATISTICS
Funding
US National Science Foundation;
Keywords
Lasso; irrepresentable condition; Lasso+mLS and Lasso+Ridge; sparsity; asymptotic unbiasedness; asymptotic normality; residual bootstrap; MODEL SELECTION CONSISTENCY; VARIABLE SELECTION; ADAPTIVE LASSO; RECOVERY; REPRESENTATIONS;
DOI
10.1214/14-EJS875
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
We study the asymptotic properties of Lasso+mLS and Lasso+Ridge under the sparse high-dimensional linear regression model: Lasso selecting predictors and then modified Least Squares (mLS) or Ridge estimating their coefficients. First, we propose a valid inference procedure for parameter estimation based on parametric residual bootstrap after Lasso+mLS and Lasso+Ridge. Second, we derive the asymptotic unbiasedness of Lasso+mLS and Lasso+Ridge. More specifically, we show that their biases decay at an exponential rate and that they can achieve the oracle convergence rate of s/n (where s is the number of nonzero regression coefficients and n is the sample size) for mean squared error (MSE). Third, we show that Lasso+mLS and Lasso+Ridge are asymptotically normal. They have an oracle property in the sense that they can select the true predictors with probability converging to 1, and the estimates of the nonzero parameters have the same asymptotic normal distribution that they would have if the zero parameters were known in advance. In fact, our analysis is not limited to adopting Lasso in the selection stage, but is applicable to any other model selection criterion with an exponentially decaying probability of selecting wrong models.
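To make the two-stage procedure concrete, below is a minimal Python sketch, not the authors' code, of a Lasso+Ridge estimator together with a simple residual-bootstrap loop for interval estimation. All function names, tuning parameters, and resampling details are illustrative assumptions; in particular, the paper's mLS refit and its parametric residual bootstrap differ in the details.

```python
# Illustrative sketch only: a scikit-learn based Lasso+Ridge two-stage estimator
# and a residual-bootstrap loop. Names and tuning choices are assumptions,
# not taken from the paper.
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)

def lasso_plus_ridge(X, y, lasso_alpha=0.1, ridge_alpha=1.0):
    """Stage 1: Lasso selects predictors; stage 2: Ridge refits on the selected support."""
    p = X.shape[1]
    support = np.flatnonzero(Lasso(alpha=lasso_alpha).fit(X, y).coef_)
    beta = np.zeros(p)
    if support.size:
        beta[support] = Ridge(alpha=ridge_alpha).fit(X[:, support], y).coef_
    return beta, support

def residual_bootstrap(X, y, B=500, level=0.95, **fit_kwargs):
    """Percentile intervals from resampled (centered) residuals after Lasso+Ridge."""
    n, p = X.shape
    beta_hat, _ = lasso_plus_ridge(X, y, **fit_kwargs)
    resid = y - X @ beta_hat
    resid -= resid.mean()
    boot = np.empty((B, p))
    for b in range(B):
        y_star = X @ beta_hat + rng.choice(resid, size=n, replace=True)
        boot[b], _ = lasso_plus_ridge(X, y_star, **fit_kwargs)
    alpha = (1.0 - level) / 2.0
    lower, upper = np.quantile(boot, [alpha, 1.0 - alpha], axis=0)
    return beta_hat, lower, upper

# Simulated sparse design with n < p and s nonzero coefficients
n, p, s = 100, 200, 5
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:s] = 2.0
y = X @ beta_true + rng.standard_normal(n)
beta_hat, lower, upper = residual_bootstrap(X, y)
```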
Pages: 3124 - 3169
Number of pages: 46
Related papers
50 records in total
  • [1] A BOOTSTRAP LASSO plus PARTIAL RIDGE METHOD TO CONSTRUCT CONFIDENCE INTERVALS FOR PARAMETERS IN HIGH-DIMENSIONAL SPARSE LINEAR MODELS
    Liu, Hanzhong
    Xu, Xin
    Li, Jingyi Jessica
    [J]. STATISTICA SINICA, 2020, 30 (03) : 1333 - 1355
  • [2] Asymptotic properties of Lasso in high-dimensional partially linear models
    Ma, Chi
    Huang, Jian
    [J]. SCIENCE CHINA-MATHEMATICS, 2016, 59 (04) : 769 - 788
  • [3] ADAPTIVE LASSO FOR SPARSE HIGH-DIMENSIONAL REGRESSION MODELS
    Huang, Jian
    Ma, Shuangge
    Zhang, Cun-Hui
    [J]. STATISTICA SINICA, 2008, 18 (04) : 1603 - 1618
  • [4] ASYMPTOTIC ANALYSIS OF HIGH-DIMENSIONAL LAD REGRESSION WITH LASSO
    Gao, Xiaoli
    Huang, Jian
    [J]. STATISTICA SINICA, 2010, 20 (04) : 1485 - 1506
  • [5] Spline-Lasso in High-Dimensional Linear Regression
    Guo, Jianhua
    Hu, Jianchang
    Jing, Bing-Yi
    Zhang, Zhen
    [J]. JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2016, 111 (513) : 288 - 297
  • [6] Localized Lasso for High-Dimensional Regression
    Yamada, Makoto
    Takeuchi, Koh
    Iwata, Tomoharu
    Shawe-Taylor, John
    Kaski, Samuel
    [J]. ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 54, 2017, 54 : 325 - 333
  • [7] The sparsity and bias of the lasso selection in high-dimensional linear regression
    Zhang, Cun-Hui
    Huang, Jian
    [J]. ANNALS OF STATISTICS, 2008, 36 (04): : 1567 - 1594
  • [8] Influence Diagnostics for High-Dimensional Lasso Regression
    Rajaratnam, Bala
    Roberts, Steven
    Sparks, Doug
    Yu, Honglin
    [J]. JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2019, 28 (04) : 877 - 890