Robust regularized extreme learning machine for regression using iteratively reweighted least squares

Cited by: 63
Authors
Chen, Kai [1 ]
Lv, Qi [1 ]
Lu, Yao [2 ]
Dou, Yong [1 ]
Affiliations
[1] Natl Univ Def Technol, Natl Lab Parallel & Distributed Proc, Changsha, Hunan, Peoples R China
[2] Natl Univ Def Technol, Coll Comp, Changsha, Hunan, Peoples R China
Keywords
Extreme learning machine; Iteratively reweighted least squares; Robustness; ℓ2-norm regularization; ℓ1-norm regularization
DOI
10.1016/j.neucom.2016.12.029
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Extreme learning machine (ELM) for regression has been used in many fields because of its easy implementation, fast training speed, and good generalization performance. However, the basic ELM with the ℓ2-norm loss function is sensitive to outliers. Recently, the ℓ1-norm loss function and the Huber loss function have been used in ELM to enhance robustness. However, these loss functions can still be affected by outliers because they grow linearly with the errors. Moreover, existing robust ELM methods use only ℓ2-norm regularization or have no regularization term at all. In this study, we propose a unified model for robust regularized ELM regression using iteratively reweighted least squares (IRLS), and call it RELM-IRLS. We perform a comprehensive study of the robust loss function and the regularization term for robust ELM regression. Four loss functions (ℓ1-norm, Huber, Bisquare, and Welsch) are used to enhance robustness, and two types of regularization (ℓ2-norm and ℓ1-norm) are used to avoid overfitting. Experiments show that our proposed RELM-IRLS with ℓ2-norm and ℓ1-norm regularization is stable and accurate for data with 0-40% outlier levels, and that RELM-IRLS with ℓ1-norm regularization can obtain a compact network because of the highly sparse output weights of the network.
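As a concrete illustration of the IRLS scheme the abstract describes, the following is a minimal sketch of robust ELM regression with a Huber loss and ℓ2-norm regularization. It is a reconstruction under stated assumptions, not the paper's exact RELM-IRLS algorithm: the function names, the tanh hidden activation, the MAD-based residual scale, and all default hyperparameters are illustrative choices.

```python
import numpy as np

def elm_irls_huber(X, y, n_hidden=100, lam=1e-2, delta=1.345, n_iter=20, seed=0):
    """Illustrative sketch: ELM regression made robust by IRLS with Huber
    weights and l2 regularization. Not the authors' exact RELM-IRLS;
    names and defaults are assumptions."""
    rng = np.random.default_rng(seed)
    # ELM: hidden-layer weights and biases are drawn at random and never trained.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                      # hidden-layer output matrix

    # Start from the ordinary ridge (l2-regularized least squares) solution.
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)

    for _ in range(n_iter):
        r = y - H @ beta                        # residuals of the current fit
        # Robust scale estimate of the residuals (median absolute deviation).
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12
        u = r / s
        # Huber weights: 1 inside the threshold, delta/|u| for large residuals,
        # so outliers contribute less to the next weighted least-squares step.
        w = np.where(np.abs(u) <= delta, 1.0, delta / np.abs(u))
        Hw = H * w[:, None]                     # row-reweighted design matrix
        beta = np.linalg.solve(H.T @ Hw + lam * np.eye(n_hidden), Hw.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    # Apply the fixed random hidden layer, then the learned output weights.
    return np.tanh(X @ W + b) @ beta
```

In this sketch, swapping the Huber weight formula for a Bisquare or Welsch weight function, or replacing the ridge update with an ℓ1-regularized solve, would follow the same reweighting loop; those variants are what the paper compares.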
Pages: 345 - 358 (14 pages)
Related papers (50 in total)
  • [1] Robust regression computation using iteratively reweighted least-squares
    O'Leary, DP
    [J]. SIAM JOURNAL ON MATRIX ANALYSIS AND APPLICATIONS, 1990, 11 (03) : 466 - 480
  • [2] Globally-convergent Iteratively Reweighted Least Squares for Robust Regression Problems
    Mukhoty, Bhaskar
    Gopakumar, Govind
    Jain, Prateek
    Kar, Purushottam
    [J]. 22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89 : 313 - 322
  • [3] Iteratively reweighted least squares based learning
    Warner, BA
    Misra, M
    [J]. IEEE WORLD CONGRESS ON COMPUTATIONAL INTELLIGENCE, 1998, : 1327 - 1331
  • [4] Robust registration of point sets using iteratively reweighted least squares
    Bergström, Per
    Edlund, Ove
    [J]. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2014, 58 (03) : 543 - 561
  • [6] Robust spectrotemporal decomposition by iteratively reweighted least squares
    Ba, Demba
    Babadi, Behtash
    Purdon, Patrick L.
    Brown, Emery N.
    [J]. PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2014, 111 (50) : E5336 - E5345
  • [7] A note on computing robust regression estimates via iteratively reweighted least-squares
    Street, JO
    Carroll, RJ
    Ruppert, D
    [J]. AMERICAN STATISTICIAN, 1988, 42 (02): : 152 - 154
  • [8] Fast iteratively reweighted least squares for Lp regularized image deconvolution and reconstruction
    Zhou, Xu
    Molina, Rafael
    Zhou, Fugen
    Katsaggelos, Aggelos K.
    [J]. 2014 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2014, : 1783 - 1787
  • [10] Kernel-based regression via a novel robust loss function and iteratively reweighted least squares
    Dong, Hongwei
    Yang, Liming
    [J]. KNOWLEDGE AND INFORMATION SYSTEMS, 2021, 63 (05) : 1149 - 1172