Fast optimization methods for L1 regularization: A comparative study and two new approaches

Cited by: 0
Authors
Schmidt, Mark [1]
Fung, Glenn [2 ]
Rosales, Romer [2 ]
Institutions
[1] Univ British Columbia, Dept Comp Sci, Vancouver, BC V6T 1W5, Canada
[2] IKM CKS, Siemens Med Solut, Malvern, PA USA
Source
Keywords
DOI: Not available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
L1 regularization is effective for feature selection, but the resulting optimization is challenging due to the non-differentiability of the 1-norm. In this paper we compare state-of-the-art optimization techniques for solving this problem across several loss functions. Furthermore, we propose two new techniques. The first is based on a smooth (differentiable) convex approximation of the L1 regularizer that does not depend on any assumptions about the loss function used. The second addresses the non-differentiability of the L1 regularizer by recasting the problem as a constrained optimization, which is then solved using a specialized gradient projection method. Extensive comparisons show that our newly proposed approaches consistently rank among the best in convergence speed and efficiency, as measured by the number of function evaluations required.
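The abstract names the two ideas without giving formulas, so the following minimal Python/NumPy sketch illustrates them under stated assumptions: the surrogate shown is the standard log-sum-exp smoothing (1/α)[log(1+e^(−αw)) + log(1+e^(αw))], which tends to |w| as α grows, and the projection idea is illustrated with the usual non-negative variable splitting w = u − v solved by a plain projected-gradient loop. The function names, step size, iteration count, and smoothing constant α are illustrative choices, not the authors' exact algorithm.

```python
import numpy as np

# Technique 1 (sketch): a smooth, convex surrogate for |w|.
# (1/alpha) * [log(1 + e^(-alpha*w)) + log(1 + e^(alpha*w))] -> |w| as alpha -> inf,
# and it is differentiable everywhere, so any smooth-loss optimizer applies.
def smooth_abs(w, alpha=1e3):
    # np.logaddexp(0, x) evaluates log(1 + e^x) without overflow
    return (np.logaddexp(0.0, -alpha * w) + np.logaddexp(0.0, alpha * w)) / alpha

def smooth_abs_grad(w, alpha=1e3):
    # The derivative of the surrogate simplifies to tanh(alpha * w / 2).
    return np.tanh(alpha * w / 2.0)

# Technique 2 (sketch): recast min_w f(w) + lam * ||w||_1 as the bound-constrained
#   min_{u, v >= 0} f(u - v) + lam * sum(u + v),   with w = u - v,
# whose objective is smooth; projecting onto the constraints is just max(., 0).
def gradient_projection_l1(grad_f, w0, lam, step=1e-3, iters=2000):
    u = np.maximum(w0, 0.0)    # positive part of w
    v = np.maximum(-w0, 0.0)   # negative part of w
    for _ in range(iters):
        g = grad_f(u - v)                            # loss gradient at w = u - v
        u = np.maximum(u - step * (g + lam), 0.0)    # gradient step, project u >= 0
        v = np.maximum(v - step * (lam - g), 0.0)    # gradient step, project v >= 0
    return u - v
```

For example, with a least-squares loss f(w) = ½‖Xw − y‖², the loss gradient is Xᵀ(Xw − y):

```python
rng = np.random.default_rng(0)
X, y = rng.normal(size=(50, 20)), rng.normal(size=50)
w_hat = gradient_projection_l1(lambda w: X.T @ (X @ w - y), np.zeros(20), lam=1.0)
```

The fixed step size keeps the sketch short; the comparison in the paper concerns exactly how much more efficient specialized projection and step-size strategies are, counted in function evaluations.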
Pages: 286 / +
Number of pages: 2