Some sharp performance bounds for least squares regression with L1 regularization

Cited by: 123
Authors: Zhang, Tong [1]
Affiliation: [1] Rutgers State Univ, Dept Stat, Piscataway, NJ 08854 USA
Source: ANNALS OF STATISTICS | 2009, Vol. 37, No. 5A
Keywords: L-1 regularization; Lasso; regression; sparsity; variable selection; parameter estimation; STATISTICAL ESTIMATION; DANTZIG SELECTOR; SPARSITY; LARGER; LASSO
DOI: 10.1214/08-AOS659
CLC classification: O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes: 020208; 070103; 0714
Abstract
We derive sharp performance bounds for least squares regression with L-1 regularization from the perspectives of parameter estimation accuracy and feature selection quality. The main result proved for L-1 regularization extends a similar result in [Ann. Statist. 35 (2007) 2313-2351] for the Dantzig selector. It gives an affirmative answer to an open question in [Ann. Statist. 35 (2007) 2358-2364]. Moreover, the result leads to an extended view of feature selection that allows less restrictive conditions than some recent work. Based on the theoretical insights, a novel two-stage L-1-regularization procedure with selective penalization is analyzed. It is shown that if the target parameter vector can be decomposed as the sum of a sparse parameter vector with large coefficients and another less sparse vector with relatively small coefficients, then the two-stage procedure can lead to improved performance.
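The two-stage idea in the abstract can be sketched in code. This is a schematic illustration, not the paper's exact estimator or tuning: the proximal-gradient (ISTA) solver, the regularization level, and the "large coefficient" threshold of 0.5 are all assumed choices. Stage one runs an ordinary Lasso to locate the large coefficients; stage two re-solves with the penalty removed (weight 0) on the detected support, so only the remaining coordinates are penalized.

```python
import numpy as np

def weighted_lasso_ista(X, y, lam, weights, n_iter=500):
    """Proximal gradient (ISTA) for 0.5/n * ||y - X b||^2 + lam * sum_j weights_j * |b_j|."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n      # Lipschitz constant of the smooth part's gradient
    step = 1.0 / L
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n       # gradient of the least squares term
        z = b - step * grad
        thresh = step * lam * weights      # per-coordinate soft-threshold level
        b = np.sign(z) * np.maximum(np.abs(z) - thresh, 0.0)
    return b

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:3] = [5.0, -4.0, 3.0]                # sparse part: few large coefficients
beta[3:8] = 0.1                            # less sparse part: small coefficients
y = X @ beta + 0.5 * rng.standard_normal(n)

# Stage 1: ordinary Lasso (all weights = 1) to locate the large coefficients
b1 = weighted_lasso_ista(X, y, lam=0.2, weights=np.ones(p))
support = np.abs(b1) > 0.5                 # hypothetical threshold for "large"

# Stage 2: selective penalization -- no penalty on the detected support
w2 = np.where(support, 0.0, 1.0)
b2 = weighted_lasso_ista(X, y, lam=0.2, weights=w2)
```

Stage two removes the Lasso's shrinkage bias on the large coefficients while still penalizing the rest, which is the mechanism behind the improved bounds the abstract describes for decomposable targets.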
Pages: 2109-2144
Page count: 36
Related papers (50 total)
  • [1] DISCRIMINANT AND SPARSITY BASED LEAST SQUARES REGRESSION WITH l1 REGULARIZATION FOR FEATURE REPRESENTATION
    Zhao, Shuping
    Zhang, Bob
    Li, Shuyi
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 1504 - 1508
  • [2] A recursive least squares algorithm with l1 regularization for sparse representation
    Liu, Di
    Baldi, Simone
    Liu, Quan
    Yu, Wenwu
    SCIENCE CHINA-INFORMATION SCIENCES, 2023, 66 (02)
  • [3] Improved Convergence for l∞ and l1 Regression via Iteratively Reweighted Least Squares
    Ene, Alina
    Vladu, Adrian
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [4] Robust censored regression with l1-norm regularization
    Beyhum, Jad
    Van Keilegom, Ingrid
    TEST, 2023, 32 (01) : 146 - 162
  • [5] Local regularization assisted orthogonal least squares regression
    Chen, S
    NEUROCOMPUTING, 2006, 69 (4-6) : 559 - 585
  • [6] Aggregation and sparsity via l1 penalized least squares
    Bunea, Florentina
    Tsybakov, Alexandre B.
    Wegkamp, Marten H.
    LEARNING THEORY, PROCEEDINGS, 2006, 4005 : 379 - 391
  • [7] Least-squares RTM with L1 norm regularisation
    Wu, Di
    Yao, Gang
    Cao, Jingjie
    Wang, Yanghua
    JOURNAL OF GEOPHYSICS AND ENGINEERING, 2016, 13 (05) : 666 - 673
  • [8] A Sharp Nonasymptotic Bound and Phase Diagram of L1/2 Regularization
    Zhang, Hai
    Xu, Zong Ben
    Wang, Yao
    Chang, Xiang Yu
    Liang, Yong
    ACTA MATHEMATICA SINICA-ENGLISH SERIES, 2014, 30 (07) : 1242 - 1258
  • [9] New Bounds on Compressive Linear Least Squares Regression
    Kaban, Ata
    ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 33, 2014, 33 : 448 - 456