SCAD-Penalized Least Absolute Deviation Regression in High-Dimensional Models

Cited by: 19
Authors
Wang, Mingqiu [1,2]
Song, Lixin [1]
Tian, Guo-Liang [3]
Affiliations
[1] Dalian Univ Technol, Sch Math Sci, Dalian 116023, Liaoning, Peoples R China
[2] Qufu Normal Univ, Sch Stat, Qufu, Peoples R China
[3] Univ Hong Kong, Dept Stat & Actuarial Sci, Hong Kong, Hong Kong, Peoples R China
Funding
China Postdoctoral Science Foundation
Keywords
Empirical process; LAD-SCAD estimator; Oracle property; Rank correlation screening; Stochastic equicontinuity; Variable selection
Keywords Plus
VARIABLE SELECTION; ASYMPTOTIC-BEHAVIOR; ROBUST REGRESSION; DIVERGING NUMBER; M-ESTIMATORS; PARAMETERS; LIKELIHOOD; LASSO; SHRINKAGE; P2/N
DOI
10.1080/03610926.2013.781643
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Classification Codes
020208; 070103; 0714
Abstract
When outliers and/or heavy-tailed errors exist in linear models, least absolute deviation (LAD) regression is a robust alternative to ordinary least squares regression. Existing variable-selection methods for linear models based on LAD regression either consider only a finite number of predictors or lack the oracle property for the resulting estimator. In this article, we focus on variable selection via LAD regression with a diverging number of parameters. The rate of convergence of the LAD estimator with the smoothly clipped absolute deviation (SCAD) penalty is established. Furthermore, we demonstrate that, under certain regularity conditions, the penalized estimator with a properly chosen tuning parameter enjoys the oracle property. In addition, the rank correlation screening method originally proposed by Li et al. (2011) is applied to handle ultrahigh-dimensional data. Simulation studies are conducted to assess the finite-sample performance of the estimator, and the proposed methodology is further illustrated with a real example.
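For context, the LAD-SCAD estimator penalizes the least absolute deviation loss with the SCAD penalty of Fan and Li (2001). The display below is a minimal sketch of the standard form of this criterion, not the paper's exact formulation; the scaling of the penalty term and the symbols $p_n$, $\lambda_n$, and $a$ are illustrative assumptions.

\[
  \widehat{\beta} \;=\; \arg\min_{\beta}\;
  \sum_{i=1}^{n} \bigl| y_i - x_i^{\top}\beta \bigr|
  \;+\; n \sum_{j=1}^{p_n} p_{\lambda_n}\!\bigl(|\beta_j|\bigr),
\]
where, for $\theta \ge 0$ and a constant $a > 2$ (commonly $a = 3.7$), the SCAD penalty is
\[
  p_{\lambda}(\theta) =
  \begin{cases}
    \lambda\theta, & 0 \le \theta \le \lambda,\\[2pt]
    \dfrac{2a\lambda\theta - \theta^{2} - \lambda^{2}}{2(a-1)}, & \lambda < \theta \le a\lambda,\\[4pt]
    \dfrac{(a+1)\lambda^{2}}{2}, & \theta > a\lambda.
  \end{cases}
\]
The penalty grows linearly near zero but flattens to a constant for large coefficients, so large signals are essentially unshrunk; this tapering is what underlies the oracle property discussed in the abstract.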
Pages: 2452-2472
Number of pages: 21
Related Papers (50 in total)
  • [41] Davis, Richard A.; Dunsmuir, William T. M. Least Absolute Deviation Estimation for Regression with ARMA Errors. Journal of Theoretical Probability, 1997, 10: 481-497.
  • [42] Li, Yinbo; Arce, Gonzalo R. A Maximum Likelihood Approach to Least Absolute Deviation Regression. EURASIP Journal on Advances in Signal Processing, 2004.
  • [43] Choi, Seung Hoe; Buckley, James J. Fuzzy regression using least absolute deviation estimators. Soft Computing, 2008, 12: 257-263.
  • [44] Guo, Xiao; Zhang, Hai; Wang, Yao; Wu, Jiang-Lun. Model selection and estimation in high dimensional regression models with group SCAD. Statistics & Probability Letters, 2015, 103: 86-92.
  • [45] Spokoiny, V. Variance estimation for high-dimensional regression models. Journal of Multivariate Analysis, 2002, 82(1): 111-133.
  • [46] Rinaldo, Alessandro; Wang, Daren; Wen, Qin; Willett, Rebecca; Yu, Yi. Localizing Changes in High-Dimensional Regression Models. 24th International Conference on Artificial Intelligence and Statistics (AISTATS), 2021, 130.
  • [47] Gu, Yuwen; Zou, Hui. High-Dimensional Generalizations of Asymmetric Least Squares Regression and Their Applications. Annals of Statistics, 2016, 44(6): 2661-2694.
  • [48] Li, Yaguang; Wu, Yaohua; Jin, Baisuo. Consistent tuning parameter selection in high-dimensional group-penalized regression. Science China Mathematics, 2019, 62(4): 751-770.
  • [49] Sun, Qiang; Zhang, Heping. Targeted Inference Involving High-Dimensional Data Using Nuisance Penalized Regression. Journal of the American Statistical Association, 2021, 116(535): 1472-1486.