Parsimonious regularized extreme learning machine based on orthogonal transformation

Cited by: 12
Authors
Zhao, Yong-Ping [1]
Wang, Kang-Kang [1]
Li, Ye-Bo [2]
Affiliations
[1] Nanjing Univ Sci & Technol, Sch Mech Engn, Nanjing 210094, Jiangsu, Peoples R China
[2] AVIC Aeroengine Control Res Inst, Wuxi 214063, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Extreme learning machine; Sparseness; Tikhonov regularization; Orthogonal transformation; Condition number; FEEDFORWARD NETWORKS; NEURAL-NETWORKS; CLASSIFICATION; OPTIMIZATION; REGRESSION; IDENTIFICATION; APPROXIMATION; ALGORITHM; SELECTION; ELM
DOI
10.1016/j.neucom.2014.12.046
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Recently, two parsimonious algorithms were proposed to sparsify the extreme learning machine (ELM), namely the constructive parsimonious ELM (CP-ELM) and the destructive parsimonious ELM (DP-ELM). In this paper, the ideas behind CP-ELM and DP-ELM are extended to the regularized ELM (RELM), yielding CP-RELM and DP-RELM. CP-RELM (DP-RELM) can be realized by two schemes, viz. CP-RELM-I and CP-RELM-II (DP-RELM-I and DP-RELM-II). Generally speaking, CP-RELM-II (DP-RELM-II) outperforms CP-RELM-I (DP-RELM-I) in terms of parsimoniousness. Under nearly the same generalization, CP-RELM-II (DP-RELM-II) usually needs fewer hidden nodes than CP-ELM (DP-ELM). In addition, unlike CP-ELM and DP-ELM, CP-RELM and DP-RELM allow the number of candidate hidden nodes to exceed the number of training samples, which assists the selection of much better hidden nodes for constructing more compact networks. Finally, eleven benchmark data sets divided into two groups are used in experiments that demonstrate the usefulness of the proposed algorithms. (C) 2014 Elsevier B.V. All rights reserved.
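The RELM that the paper sparsifies solves a Tikhonov-regularized least-squares problem for the output weights: beta = (H'H + lambda*I)^{-1} H'y, where H is the random hidden-layer output matrix. The following is a minimal sketch of that baseline RELM (not the paper's CP-RELM/DP-RELM node-selection procedures); the activation choice, node count, and regularization value are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: learn y = sin(x) on [-3, 3].
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel()

# Random hidden layer with sigmoid activation, as in a basic ELM:
# input weights and biases are drawn at random and never trained.
L = 100                                  # number of candidate hidden nodes (assumed)
W = rng.normal(size=(X.shape[1], L))     # fixed random input weights
b = rng.normal(size=L)                   # fixed random hidden biases
H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # hidden-layer output matrix (200 x 100)

# Tikhonov-regularized output weights (RELM):
#   beta = (H^T H + lam * I)^{-1} H^T y
lam = 1e-3                               # regularization parameter (assumed)
beta = np.linalg.solve(H.T @ H + lam * np.eye(L), H.T @ y)

# Fitted values and training error.
y_hat = H @ beta
rmse = np.sqrt(np.mean((y_hat - y) ** 2))
```

The ridge term lam * I keeps the normal-equations matrix well conditioned, which is what lets the candidate node count L grow past the sample count; the paper's CP/DP procedures then prune this dense beta down to a compact subset of hidden nodes via orthogonal transformations.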
Pages: 280-296 (17 pages)
Related Papers
50 records
  • [1] Parsimonious Extreme Learning Machine Using Recursive Orthogonal Least Squares
    Wang, Ning
    Er, Meng Joo
    Han, Min
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2014, 25 (10) : 1828 - 1841
  • [2] Improvements on parsimonious extreme learning machine using recursive orthogonal least squares
    Zhao, Yong-Ping
    Huerta, Ramon
    [J]. NEUROCOMPUTING, 2016, 191 : 82 - 94
  • [3] Regularized Extreme Learning Machine
    Deng, Wanyu
    Zheng, Qinghua
    Chen, Lin
    [J]. 2009 IEEE SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND DATA MINING, 2009, : 389 - 395
  • [4] Parsimonious wavelet kernel extreme learning machine
    Department of Information Technology, Hainan Medical University, Haikou 571101, China
    [J]. J. Eng. Sci. Technol. Rev., 5 (219-226):
  • [5] Manifold regularized extreme learning machine
    Liu, Bing
    Xia, Shi-Xiong
    Meng, Fan-Rong
    Zhou, Yong
    [J]. NEURAL COMPUTING & APPLICATIONS, 2016, 27 (02): : 255 - 269
  • [6] Smoothing Regularized Extreme Learning Machine
    Fan, Qin-Wei
    He, Xing-Shi
    Yang, Xin-She
    [J]. ENGINEERING APPLICATIONS OF NEURAL NETWORKS, EANN 2018, 2018, 893 : 83 - 93
  • [7] An accelerating scheme for destructive parsimonious extreme learning machine
    Zhao, Yong-Ping
    Li, Bing
    Li, Ye-Bo
    [J]. NEUROCOMPUTING, 2015, 167 : 671 - 687
  • [8] An Adaptive Learning Algorithm for Regularized Extreme Learning Machine
    Zhang, Yuao
    Wu, Qingbiao
    Hu, Jueliang
    [J]. IEEE ACCESS, 2021, 9 : 20736 - 20745
  • [9] Timeliness Online Regularized Extreme Learning Machine
    Luo, Xiong
    Yang, Xiaona
    Jiang, Changwei
    Ban, Xiaojuan
    [J]. PROCEEDINGS OF ELM-2015, VOL 1: THEORY, ALGORITHMS AND APPLICATIONS (I), 2016, 6 : 477 - 487