Fault and Noise Tolerance in the Incremental Extreme Learning Machine

Cited by: 15
Authors
Leung, Ho Chun [1 ]
Leung, Chi Sing [1 ]
Wong, Eric Wing Ming [1 ]
Affiliations
[1] City Univ Hong Kong, Dept Elect Engn, Hong Kong, Peoples R China
Source
IEEE ACCESS, 2019, Vol. 7
Keywords
Single hidden layer network; incremental learning; extreme learning machine; multiplicative noise; open fault; NEURAL-NETWORKS; FEEDFORWARD NETWORKS; ERROR ANALYSIS; DESIGN;
DOI
10.1109/ACCESS.2019.2948059
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812 ;
Abstract
The extreme learning machine (ELM) is an efficient way to build single-hidden-layer feedforward networks (SLFNs). However, its fault tolerance is very weak. When node noise or node failures exist in a network trained under the ELM concept, the network's performance degrades greatly unless a countermeasure is taken. However, countermeasures of this kind for the ELM or for incremental learning are seldom reported. This paper considers the situation in which a trained SLFN suffers from the coexistence of node faults and node noise. We develop two fault-tolerant incremental ELM algorithms for the regression problem, namely the node fault tolerant incremental ELM (NFTI-ELM) and the node fault tolerant convex incremental ELM (NFTCI-ELM). The NFTI-ELM determines the output weight of the newly inserted node only. We prove that the NFTI-ELM converges in terms of the training set mean squared error (MSE) of faulty SLFNs. Our numerical results show that the NFTI-ELM is superior to the conventional ELM and incremental ELM algorithms under faulty situations. To further improve the performance, we propose the NFTCI-ELM algorithm, which not only determines the output weight of the newly inserted node but also updates all previously trained output weights. In terms of the training set MSE of faulty SLFNs, the NFTCI-ELM converges, and it is superior to the NFTI-ELM.
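The fault-tolerant algorithms above build on the standard incremental ELM (I-ELM) baseline, in which hidden nodes are added one at a time with random input weights, and only the new node's output weight is computed in closed form against the current residual. A minimal sketch of that baseline follows; it illustrates the incremental construction only, not the paper's NFTI-ELM objective, which additionally models multiplicative node noise and open faults (function names and parameters here are illustrative, not from the paper):

```python
import numpy as np

def incremental_elm(X, y, max_nodes=50, seed=None):
    """Baseline incremental ELM (I-ELM) sketch for regression.

    Hidden nodes are added one by one; each node's input weights are
    random, and its output weight minimizes the squared residual:
        beta = <e, h> / <h, h>,
    where e is the current residual and h is the node's activations.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    e = y.astype(float).copy()          # residual starts as the target
    nodes = []                          # (a, b, beta) per hidden node
    for _ in range(max_nodes):
        a = rng.uniform(-1.0, 1.0, size=d)  # random input weights
        b = rng.uniform(-1.0, 1.0)          # random bias
        h = np.tanh(X @ a + b)              # new node's activations
        beta = (e @ h) / (h @ h)            # closed-form output weight
        e -= beta * h                       # update residual
        nodes.append((a, b, beta))
    return nodes, e

def predict(nodes, X):
    """Evaluate the grown SLFN on new inputs."""
    out = np.zeros(X.shape[0])
    for a, b, beta in nodes:
        out += beta * np.tanh(X @ a + b)
    return out
```

Because each beta minimizes the squared residual for its node, the training-set MSE is non-increasing in the number of added nodes; the NFTI-ELM replaces this plain residual objective with an expectation over the fault and noise statistics.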
Pages: 155171-155183 (13 pages)
Related Papers (50 records in total)
  • [1] Noise/fault aware regularization for incremental learning in extreme learning machines
    Wong, Hiu-Tung
    Leung, Ho-Chun
    Leung, Chi-Sing
    Wong, Eric
    NEUROCOMPUTING, 2022, 486 : 200 - 214
  • [2] Robust Incremental Extreme Learning Machine
    Shao, Zhifei
    Er, Meng Joo
    Wang, Ning
    2014 13TH INTERNATIONAL CONFERENCE ON CONTROL AUTOMATION ROBOTICS & VISION (ICARCV), 2014, : 607 - 612
  • [3] Incremental constructive extreme learning machine
    Li, Fan-Jun
    Qiao, Jun-Fei
    Han, Hong-Gui
    Kongzhi Lilun Yu Yingyong/Control Theory and Applications, 2014, 31 (05): : 638 - 643
  • [4] Convex incremental extreme learning machine
    Huang, Guang-Bin
    Chen, Lei
    NEUROCOMPUTING, 2007, 70 (16-18) : 3056 - 3062
  • [5] An improved algorithm for incremental extreme learning machine
    Song, Shaojian
    Wang, Miao
    Lin, Yuzhang
    SYSTEMS SCIENCE & CONTROL ENGINEERING, 2020, 8 (01) : 308 - 317
  • [6] Sparse pseudoinverse incremental extreme learning machine
    Kassani, Peyman Hosseinzadeh
    Teoh, Andrew Beng Jin
    Kim, Euntai
    NEUROCOMPUTING, 2018, 287 : 128 - 142
  • [7] Fault-Tolerant Incremental Learning for Extreme Learning Machines
    Leung, Ho-Chun
    Leung, Chi-Sing
    Wong, Eric W. M.
    NEURAL INFORMATION PROCESSING, ICONIP 2016, PT II, 2016, 9948 : 168 - 176
  • [8] An incremental extreme learning machine for online sequential learning problems
    Guo, Lu
    Hao, Jing-hua
    Liu, Min
    NEUROCOMPUTING, 2014, 128 : 50 - 58
  • [9] Incremental laplacian regularization extreme learning machine for online learning
    Yang, Lixia
    Yang, Shuyuan
    Li, Sujing
    Liu, Zhi
    Jiao, Licheng
    APPLIED SOFT COMPUTING, 2017, 59 : 546 - 555
  • [10] Incremental regularized extreme learning machine and it's enhancement
    Xu, Zhixin
    Yao, Min
    Wu, Zhaohui
    Dai, Weihui
    NEUROCOMPUTING, 2016, 174 : 134 - 142