Fault and Noise Tolerance in the Incremental Extreme Learning Machine

Cited by: 15
Authors
Leung, Ho Chun [1 ]
Leung, Chi Sing [1 ]
Wong, Eric Wing Ming [1 ]
Affiliations
[1] City Univ Hong Kong, Dept Elect Engn, Hong Kong, Peoples R China
Source
IEEE ACCESS, 2019, Vol. 7
Keywords
Single hidden layer network; incremental learning; extreme learning machine; multiplicative noise; open fault; neural networks; feedforward networks; error analysis; design
DOI
10.1109/ACCESS.2019.2948059
CLC number
TP [automation technology, computer technology]
Discipline code
0812
Abstract
The extreme learning machine (ELM) is an efficient way to build single-hidden-layer feedforward networks (SLFNs), but its fault tolerance is weak. When node noise or node failure exists in a network trained under the ELM concept, the network's performance degrades greatly unless a countermeasure is taken, yet such countermeasures for the ELM or for incremental learning are seldom reported. This paper considers a trained SLFN that suffers from the coexistence of node faults and node noise. We develop two fault-tolerant incremental ELM algorithms for the regression problem: the node fault tolerant incremental ELM (NFTI-ELM) and the node fault tolerant convex incremental ELM (NFTCI-ELM). The NFTI-ELM determines the output weight of the newly inserted node only. We prove that the NFTI-ELM converges in terms of the training-set mean squared error (MSE) of faulty SLFNs, and our numerical results show that it is superior to the conventional ELM and incremental ELM algorithms under faulty situations. To further improve performance, we propose the NFTCI-ELM, which not only determines the output weight of the newly inserted node but also updates all previously trained output weights. The NFTCI-ELM likewise converges in terms of the training-set MSE of faulty SLFNs, and it is superior to the NFTI-ELM.
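The incremental construction described above can be made concrete with a small sketch. Below is a minimal Python illustration of an incremental ELM loop for regression: each step draws one random hidden node and fits only that node's output weight to the current residual, leaving earlier weights frozen. As a hedged stand-in for the fault-tolerant objective, the weight is shrunk by an assumed multiplicative-noise variance sigma2 on the new node's output (unit-mean noise, no open faults, earlier nodes assumed clean); under those assumptions the expected squared residual is minimized at beta = h·e / ((1 + sigma2) h·h). The function name and these simplifications are ours for illustration, not the paper's exact NFTI-ELM update.

```python
import numpy as np

def incremental_elm(X, y, max_nodes=50, sigma2=0.0, seed=0):
    """Sketch of an incremental ELM (I-ELM) loop for regression.

    Each step draws one random hidden node and fits only that node's
    output weight to the current residual. With sigma2 > 0 the weight
    is shrunk to account for multiplicative noise on the new node's
    output (a simplified stand-in for the paper's objective).
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    e = y.astype(float).copy()               # residual; starts at the target
    params, betas = [], []
    for _ in range(max_nodes):
        a = rng.standard_normal(n_features)  # random input weights (never retrained)
        b = rng.standard_normal()            # random bias
        h = np.tanh(X @ a + b)               # new node's activations on the data
        # Minimizer of E[(e - s*beta*h)^2] with E[s] = 1, Var[s] = sigma2:
        beta = (h @ e) / ((1.0 + sigma2) * (h @ h))
        e -= beta * h                        # greedy residual update
        params.append((a, b))
        betas.append(beta)
    return params, np.array(betas), e

# Toy usage: fit a 1-D sine curve and report the training MSE.
X = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = np.sin(3.0 * X[:, 0])
_, _, resid = incremental_elm(X, y, max_nodes=100, sigma2=0.01)
print("training MSE:", np.mean(resid ** 2))
```

The NFTCI-ELM goes one step further: when a node is inserted, the previously trained output weights are also updated, which this sketch deliberately omits.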
Pages: 155171-155183
Number of pages: 13
Related papers
50 in total (items [41]-[50] shown)
  • [41] Regularization incremental extreme learning machine with random reduced kernel for regression
    Zhou, Zhiyu
    Chen, Ji
    Zhu, Zefei
    NEUROCOMPUTING, 2018, 321 : 72 - 81
  • [42] Enhanced Fruit Fly Optimization Based Incremental Extreme Learning Machine
    Liu, Zuozhi
    Yuan, Quan
    BASIC & CLINICAL PHARMACOLOGY & TOXICOLOGY, 2020, 127 : 127 - 127
  • [43] Byzantine fault tolerance in distributed machine learning: a survey
    Bouhata, Djamila
    Moumen, Hamouma
    Mazari, Jocelyn Ahmed
    Bounceur, Ahcene
    JOURNAL OF EXPERIMENTAL & THEORETICAL ARTIFICIAL INTELLIGENCE, 2024,
  • [44] Integrated Optimization Method of Hidden Parameters in Incremental Extreme Learning Machine
    Zhang, Siyuan
    Xie, Linbo
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [45] Incremental Extreme Learning Machine via Fast Random Search Method
    Lao, Zhihui
    Zhou, Zhiheng
    Huang, Junchu
    NEURAL INFORMATION PROCESSING, ICONIP 2017, PT I, 2017, 10634 : 82 - 90
  • [46] L1-PLS Based on Incremental Extreme Learning Machine
    Sun, Zhiying
    Zhou, Jinglin
    PROCEEDINGS OF 2020 IEEE 9TH DATA DRIVEN CONTROL AND LEARNING SYSTEMS CONFERENCE (DDCLS'20), 2020, : 947 - 952
  • [47] Wind Speed Forecast for the Stratospheric Airship by Incremental Extreme Learning Machine
    Xu, Guangyu
    Shen, Shaoping
    Sun, Jietao
    PROCEEDINGS OF THE 36TH CHINESE CONTROL CONFERENCE (CCC 2017), 2017, : 4088 - 4092
  • [48] A novel visual tracking system with adaptive incremental extreme learning machine
    Wang, Zhihui
    Yoon, Sook
    Park, Dong Sun
    KSII TRANSACTIONS ON INTERNET AND INFORMATION SYSTEMS, 2017, 11 (01): : 451 - 465
  • [49] Gram-Schmidt process based incremental extreme learning machine
    Zhao, Yong-Ping
    Li, Zhi-Qiang
    Xi, Peng-Peng
    Liang, Dong
    Sun, Liguo
    Chen, Ting-Hao
    NEUROCOMPUTING, 2017, 241 : 1 - 17
  • [50] A fast incremental extreme learning machine algorithm for data streams classification
    Xu, Shuliang
    Wang, Junhong
    EXPERT SYSTEMS WITH APPLICATIONS, 2016, 65 : 332 - 344