Some sharp performance bounds for least squares regression with L1 regularization

Cited by: 123
Authors
Zhang, Tong [1 ]
Affiliations
[1] Rutgers State Univ, Dept Stat, Piscataway, NJ 08854 USA
Source
ANNALS OF STATISTICS | 2009, Vol. 37, No. 5A
Keywords
L1 regularization; Lasso; regression; sparsity; variable selection; parameter estimation; STATISTICAL ESTIMATION; DANTZIG SELECTOR; SPARSITY; LARGER; LASSO;
DOI
10.1214/08-AOS659
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
We derive sharp performance bounds for least squares regression with L1 regularization from the parameter estimation accuracy and feature selection quality perspectives. The main result proved for L1 regularization extends a similar result in [Ann. Statist. 35 (2007) 2313-2351] for the Dantzig selector. It gives an affirmative answer to an open question in [Ann. Statist. 35 (2007) 2358-2364]. Moreover, the result leads to an extended view of feature selection that allows less restrictive conditions than some recent work. Based on the theoretical insights, a novel two-stage L1-regularization procedure with selective penalization is analyzed. It is shown that if the target parameter vector can be decomposed as the sum of a sparse parameter vector with large coefficients and another less sparse vector with relatively small coefficients, then the two-stage procedure can lead to improved performance.
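The two-stage procedure described in the abstract can be illustrated with a minimal sketch. This is not the paper's exact algorithm: it assumes a plain Lasso first stage, a hard threshold `tau` (a hypothetical tuning parameter) to identify the large coefficients, and a second stage that re-solves the least squares problem while penalizing only the coordinates outside that set (selective penalization). The weighted Lasso is solved here with a simple proximal-gradient (ISTA) loop; the names `weighted_lasso_ista`, `two_stage_l1`, `lam1`, and `lam2` are all illustrative choices, not identifiers from the paper.

```python
import numpy as np

def weighted_lasso_ista(X, y, lam, weights, n_iter=500):
    """Minimize 0.5*||y - X b||^2 + lam * sum_j weights[j] * |b_j|
    by proximal gradient (ISTA). weights[j] = 0 leaves b_j unpenalized."""
    p = X.shape[1]
    L = np.linalg.norm(X, 2) ** 2              # Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)               # gradient of the smooth part
        z = b - grad / L                       # gradient step
        thresh = lam * weights / L             # per-coordinate soft threshold
        b = np.sign(z) * np.maximum(np.abs(z) - thresh, 0.0)
    return b

def two_stage_l1(X, y, lam1, lam2, tau):
    """Hypothetical sketch of two-stage L1 regularization with selective
    penalization: stage 1 is a plain Lasso; coordinates whose stage-1
    coefficients exceed tau are left unpenalized in stage 2."""
    p = X.shape[1]
    b1 = weighted_lasso_ista(X, y, lam1, np.ones(p))   # stage 1: plain Lasso
    large = np.abs(b1) > tau                           # "large coefficient" set
    w = np.where(large, 0.0, 1.0)                      # penalize only the rest
    return weighted_lasso_ista(X, y, lam2, w)          # stage 2: selective penalty

# Toy example matching the abstract's decomposition: a very sparse part with
# large coefficients plus a less sparse part with small coefficients.
rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = 5.0       # sparse, large coefficients
beta[3:8] = 0.3      # less sparse, small coefficients
y = X @ beta + 0.5 * rng.standard_normal(n)
b_hat = two_stage_l1(X, y, lam1=50.0, lam2=50.0, tau=1.0)
print(np.round(b_hat[:10], 2))
```

The point of the selective penalty in stage two is that the coefficients identified as large are no longer shrunk, so they avoid the bias the Lasso penalty would otherwise impose on them, while the remaining coordinates are still regularized. This matches the abstract's intuition for why the two-stage procedure can improve on a single Lasso fit when the target decomposes as described.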
Pages: 2109-2144
Number of pages: 36
Related Papers
50 records total
  • [31] The Group-Lasso: l1,∞ Regularization versus l1,2 Regularization
    Vogt, Julia E.
    Roth, Volker
    PATTERN RECOGNITION, 2010, 6376 : 252 - 261
  • [32] Relating lp regularization and reweighted l1 regularization
    Wang, Hao
    Zeng, Hao
    Wang, Jiashan
    Wu, Qiong
    OPTIMIZATION LETTERS, 2021, 15 (08) : 2639 - 2660
  • [33] Constructive Analysis for Least Squares Regression with Generalized K-Norm Regularization
    Wang, Cheng
    Nie, Weilin
    ABSTRACT AND APPLIED ANALYSIS, 2014,
  • [34] Efficient Hardware Implementation of the l1-Regularized Least Squares for IoT Edge Computing
    Baali, Hamza
    Djelouat, Hamza
    Amira, Abbess
    Bensaali, Faycal
    Zhai, Xiaojun
    2017 IEEE 17TH INTERNATIONAL CONFERENCE ON UBIQUITOUS WIRELESS BROADBAND (ICUWB), 2017,
  • [35] Combined l1 and Greedy l0 Penalized Least Squares for Linear Model Selection
    Pokarowski, Piotr
    Mielniczuk, Jan
    JOURNAL OF MACHINE LEARNING RESEARCH, 2015, 16 : 961 - 992
  • [36] Improving the Performance of the PNLMS Algorithm Using l1 Norm Regularization
    Das, Rajib Lochan
    Chakraborty, Mrityunjoy
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2016, 24 (07) : 1280 - 1290
  • [38] Based on rough sets and L1 regularization of the fault diagnosis of linear regression model
    Yao Hong-wei
    Tong XinDi
    2016 INTERNATIONAL CONFERENCE ON INTELLIGENT TRANSPORTATION, BIG DATA & SMART CITY (ICITBS), 2017, : 490 - 492
  • [39] RECURRENT NEURAL NETWORK WITH L1/2 REGULARIZATION FOR REGRESSION AND MULTICLASS CLASSIFICATION PROBLEMS
    Li, Lin
    Fan, Qinwei
    Zhou, Li
    JOURNAL OF NONLINEAR FUNCTIONAL ANALYSIS, 2022, 2022