Robust Regression and Lasso

Cited: 97
Authors
Xu, Huan [2 ]
Caramanis, Constantine [1 ]
Mannor, Shie [2 ]
Affiliations
[1] Univ Texas Austin, Dept Elect & Comp Engn, Austin, TX 78712 USA
[2] McGill Univ, Dept Elect & Comp Engn, Montreal, PQ H3A 2A7, Canada
Funding
Israel Science Foundation
Keywords
Lasso; regression; regularization; robustness; sparsity; stability; statistical learning
DOI
10.1109/TIT.2010.2048503
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Lasso, or ℓ1-regularized least squares, has been explored extensively for its remarkable sparsity properties. In this paper it is shown that the solution to Lasso, in addition to its sparsity, has robustness properties: it is the solution to a robust optimization problem. This has two important consequences. First, robustness provides a connection of the regularizer to a physical property, namely, protection from noise. This allows a principled selection of the regularizer, and in particular, generalizations of Lasso that also yield convex optimization problems are obtained by considering different uncertainty sets. Second, robustness can itself be used as an avenue for exploring different properties of the solution. In particular, it is shown that robustness of the solution explains why the solution is sparse. The analysis, as well as the specific results obtained, differs from standard sparsity results, providing different geometric intuition. Furthermore, it is shown that the robust optimization formulation is related to kernel density estimation, and based on this approach, a proof that Lasso is consistent is given, using robustness directly. Finally, a theorem is proved which states that sparsity and algorithmic stability contradict each other, and hence Lasso is not stable.
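The abstract describes Lasso as ℓ1-regularized least squares. As a minimal illustration of the sparsity property it discusses, the following NumPy sketch solves the standard squared-loss Lasso by coordinate descent (soft-thresholding). Note this is a generic textbook formulation for illustration only, not the paper's robust-optimization formulation, which pairs the ℓ1 penalty with an un-squared ℓ2 loss term; all names and data below are hypothetical.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator: the proximal map of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate descent for min_b 0.5*||y - Xb||_2^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)  # per-coordinate curvature ||X_j||^2
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j's contribution removed
            r = y - X @ b + X[:, j] * b[j]
            b[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
    return b

# Toy demo: a sparse ground truth is recovered with few nonzero coefficients.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
beta_true = np.zeros(10)
beta_true[0], beta_true[1] = 3.0, -2.0
y = X @ beta_true + 0.1 * rng.standard_normal(50)
b = lasso_cd(X, y, lam=5.0)
```

With the penalty `lam` large relative to the noise, the coordinates outside the true support are thresholded exactly to zero, which is the sparsity behavior the abstract (and the paper's robustness argument) is about.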
Pages: 3561 - 3574 (14 pages)
Related Papers
50 total; entries [21] - [30] shown
  • [21] The group lasso for logistic regression
    Meier, Lukas
    van de Geer, Sara A.
    Buhlmann, Peter
    JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY, 2008, 70 : 53 - 71
  • [22] Lazy lasso for local regression
    Vidaurre, Diego
    Bielza, Concha
    Larrañaga, Pedro
    COMPUTATIONAL STATISTICS, 2012, 27 (03) : 531 - 550
  • [23] A Comparison of the Lasso and Marginal Regression
    Genovese, Christopher R.
    Jin, Jiashun
    Wasserman, Larry
    Yao, Zhigang
    JOURNAL OF MACHINE LEARNING RESEARCH, 2012, 13 : 2107 - 2143
  • [24] Isotonic regression meets LASSO
    Neykov, Matey
    ELECTRONIC JOURNAL OF STATISTICS, 2019, 13 (01): : 710 - 746
  • [25] Marginalized lasso in sparse regression
    Lee, Seokho
    Kim, Seonhwa
    JOURNAL OF THE KOREAN STATISTICAL SOCIETY, 2019, 48 (03) : 396 - 411
  • [26] A network Lasso model for regression
    Su, Meihong
    Wang, Wenjian
    COMMUNICATIONS IN STATISTICS-THEORY AND METHODS, 2023, 52 (06) : 1702 - 1727
  • [29] Adaptive Lasso and group-Lasso for functional Poisson regression
    Ivanoff, Stephane
    Picard, Franck
    Rivoirard, Vincent
    JOURNAL OF MACHINE LEARNING RESEARCH, 2016, 17
  • [30] Efficient Model Selection of Collector Efficiency in Solar Dryer using Hybrid of LASSO and Robust Regression
    Javaid, Anam
    Ismail, Mohd Tahir
    Ali, Majid Khan Majahar
    PERTANIKA JOURNAL OF SCIENCE AND TECHNOLOGY, 2020, 28 (01): : 193 - 210