ET-Lasso: A New Efficient Tuning of Lasso-type Regularization for High-Dimensional Data

Cited by: 5
Authors
Yang, Songshan [1 ]
Wen, Jiawei [1 ]
Zhan, Xiang [1 ]
Kifer, Daniel [1 ]
Affiliation
[1] Penn State Univ, University Pk, PA 16802 USA
Keywords
high-dimensional data; Lasso; automatic tuning parameter selection; feature selection; SPARSE REPRESENTATIONS; VARIABLE SELECTION; CROSS-VALIDATION; REGRESSION; RECOVERY; SHRINKAGE;
DOI
10.1145/3292500.3330910
CLC number
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
The L1 regularization (Lasso) has proven to be a versatile tool that selects relevant features and estimates the model coefficients simultaneously, and it has been widely used in many research areas such as genome studies, finance, and biomedical imaging. Despite its popularity, it is very challenging to guarantee the feature selection consistency of Lasso, especially when the dimension of the data is huge. One way to improve feature selection consistency is to select an ideal tuning parameter. Traditional tuning criteria mainly focus on minimizing the estimated prediction error or maximizing the posterior model probability, such as cross-validation and BIC, which may either be time-consuming or fail to control the false discovery rate (FDR) when the number of features is extremely large. The other way is to introduce pseudo-features to learn the importance of the original ones. Recently, the Knockoff filter was proposed to control the FDR when performing feature selection. However, its performance is sensitive to the choice of the expected FDR threshold. Motivated by these ideas, we propose a new method that uses pseudo-features to obtain an ideal tuning parameter. In particular, we present the Efficient Tuning of Lasso (ET-Lasso), which separates active and inactive features by adding permuted features as pseudo-features in linear models. The pseudo-features are constructed to be inactive by nature, which yields a cutoff for selecting the tuning parameter that separates active and inactive features. Experimental studies on both simulations and real-world data applications show that ET-Lasso can effectively and efficiently select active features under a wide range of scenarios.
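The tuning idea described in the abstract can be illustrated with a short script. This is a hypothetical single-pass simplification, not the authors' implementation: the function name `et_lasso_select` and its parameters are ours, the permutation is done once rather than repeated, and scikit-learn's `lasso_path` stands in for whichever path solver the paper uses.

```python
import numpy as np
from sklearn.linear_model import lasso_path

def et_lasso_select(X, y, n_lambdas=100, seed=0):
    """Select the original features that enter the Lasso path before any
    pseudo-feature does.

    Hypothetical sketch of the ET-Lasso idea: row-permuted copies of the
    columns of X are inactive by construction, so the largest tuning
    parameter at which one of them enters the path serves as a cutoff.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    # Permute each column's rows independently to break any association with y.
    X_perm = np.column_stack([rng.permutation(X[:, j]) for j in range(p)])
    X_aug = np.hstack([X, X_perm])          # n x 2p augmented design
    # Lasso solution path over a decreasing grid of tuning parameters.
    alphas, coefs, _ = lasso_path(X_aug, y, n_alphas=n_lambdas)
    selected = np.zeros(p, dtype=bool)
    for k in range(len(alphas)):            # alphas sorted largest -> smallest
        active = coefs[:, k] != 0
        if active[p:].any():                # a pseudo-feature entered:
            break                           # this alpha is the cutoff
        selected |= active[:p]              # originals active before cutoff
    return np.flatnonzero(selected)
```

On simulated data with a strong sparse signal, the truly active features enter the path at large tuning-parameter values, while the permuted pseudo-features (and most noise features) only enter much later, so the cutoff screens them out without any cross-validation loop.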
Pages: 607-616