A General Framework for Sparsity Regularized Feature Selection via Iteratively Reweighted Least Square Minimization

Cited: 0
Authors
Peng, Hanyang [1 ,2 ]
Fan, Yong [3 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing 100190, Peoples R China
[2] Univ Chinese Acad Sci, Beijing 100190, Peoples R China
[3] Univ Penn, Dept Radiol, Perelman Sch Med, Philadelphia, PA 19104 USA
Funding
National Natural Science Foundation of China;
Keywords
SUPPORT VECTOR MACHINES; RECONSTRUCTION; CLASSIFICATION; REGRESSION;
DOI
Not available
CLC Classification Number
TP18 [Theory of Artificial Intelligence];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
A variety of feature selection methods based on sparsity regularization have been developed, differing in their loss functions and sparse regularization functions. Building on these existing methods, we propose a general sparsity-regularized feature selection (GSR-FS) algorithm that optimizes an ℓ2,r-norm (0 < r ≤ 2) loss function with an ℓ2,p-norm (0 < p ≤ 1) sparse regularization function. The ℓ2,r-norm loss offers flexibility to balance data fitting against robustness to outliers by tuning r, while the ℓ2,p-norm regularizer promotes sparsity for feature selection. To solve the resulting optimization problem, which is non-smooth and non-convex when r, p < 1, we develop an efficient solver under the general umbrella of Iteratively Reweighted Least Squares (IRLS) algorithms. We prove that the algorithm converges with a convergence order of at least min(2 − r, 2 − p). Experiments on publicly available datasets demonstrate that our method achieves feature selection performance competitive with state-of-the-art methods at reduced computational cost.
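The abstract describes an IRLS-type solver for an objective of the form min_W ||XW − Y||_{2,r}^r + λ||W||_{2,p}^p. As an illustration only (not the authors' published implementation), a minimal NumPy sketch of a generic IRLS update for this objective might look as follows; the function name, the least-squares initialization, and the `eps` smoothing of the reweighting terms are all assumptions made for the sketch:

```python
import numpy as np

def gsr_fs_irls(X, Y, r=1.0, p=0.5, lam=0.1, n_iter=50, eps=1e-8):
    """Illustrative IRLS sketch for min_W ||XW - Y||_{2,r}^r + lam * ||W||_{2,p}^p.

    X: (n, d) data matrix; Y: (n, c) targets (e.g. one-hot labels).
    Returns W of shape (d, c); feature scores are the row norms of W.
    """
    W = np.linalg.lstsq(X, Y, rcond=None)[0]  # plain least-squares init (assumed)
    for _ in range(n_iter):
        E = X @ W - Y
        # Residual reweighting: (r/2) * ||e_i||_2^(r-2), smoothed by eps.
        de = (r / 2.0) * (np.sum(E ** 2, axis=1) + eps) ** (r / 2.0 - 1.0)
        # Regularizer reweighting: (p/2) * ||w_j||_2^(p-2), smoothed by eps.
        dw = (p / 2.0) * (np.sum(W ** 2, axis=1) + eps) ** (p / 2.0 - 1.0)
        # Weighted least-squares subproblem: (X^T D X + lam G) W = X^T D Y.
        A = X.T @ (de[:, None] * X) + lam * np.diag(dw)
        W = np.linalg.solve(A, X.T @ (de[:, None] * Y))
    return W
```

Rows of W with small norm correspond to features that the ℓ2,p regularizer has driven toward zero, so ranking features by the row norms of W yields the selection. The `eps` term is a common smoothing device to keep the reweighting finite when a residual or weight row becomes exactly zero.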
Pages: 2471 - 2477
Page count: 7