Fault and Noise Tolerance in the Incremental Extreme Learning Machine

Cited by: 15
Authors
Leung, Ho Chun [1 ]
Leung, Chi Sing [1 ]
Wong, Eric Wing Ming [1 ]
Affiliations
[1] City Univ Hong Kong, Dept Elect Engn, Hong Kong, Peoples R China
Source
IEEE ACCESS, 2019, Vol. 7
Keywords
Single hidden layer network; incremental learning; extreme learning machine; multiplicative noise; open fault; NEURAL-NETWORKS; FEEDFORWARD NETWORKS; ERROR ANALYSIS; DESIGN;
DOI
10.1109/ACCESS.2019.2948059
Chinese Library Classification (CLC)
TP [automation technology, computer technology]
Discipline code
0812
Abstract
The extreme learning machine (ELM) is an efficient way to build single-hidden-layer feedforward networks (SLFNs). However, its fault-tolerant ability is very weak. When node noise or node failure exists in a network trained under the ELM concept, the performance of the network degrades greatly unless a countermeasure is taken. Yet such countermeasures for the ELM or for incremental learning are seldom reported. This paper considers the situation in which a trained SLFN suffers from the coexistence of node fault and node noise. We develop two fault-tolerant incremental ELM algorithms for the regression problem, namely the node fault tolerant incremental ELM (NFTI-ELM) and the node fault tolerant convex incremental ELM (NFTCI-ELM). The NFTI-ELM determines the output weight of the newly inserted node only. We prove that the NFTI-ELM converges in terms of the training-set mean squared error (MSE) of faulty SLFNs. Our numerical results show that the NFTI-ELM is superior to the conventional ELM and incremental ELM algorithms under faulty situations. To further improve performance, we propose the NFTCI-ELM, which not only determines the output weight of the newly inserted node but also updates all previously trained output weights. In terms of the training-set MSE of faulty SLFNs, the NFTCI-ELM converges, and it is superior to the NFTI-ELM.
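The full text is not part of this record, but the conventional incremental ELM (I-ELM) that the abstract uses as its baseline is well documented: hidden nodes are inserted one at a time with random input weights and biases, and only the new node's output weight is trained, by a least-squares fit against the current residual. A minimal NumPy sketch of that baseline follows (our own illustration, not the paper's fault-tolerant NFTI/NFTCI variants; the function names `incremental_elm` and `elm_predict` are hypothetical):

```python
import numpy as np

def incremental_elm(X, y, max_nodes=50, rng=None):
    """Conventional incremental ELM (I-ELM) for regression: insert one
    random sigmoid hidden node at a time and train only the new node's
    output weight (least squares against the current residual)."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    e = y.astype(float).copy()                      # residual, initially the target
    W, b, beta = [], [], []
    for _ in range(max_nodes):
        w_i = rng.uniform(-1.0, 1.0, d)             # random input weights
        b_i = rng.uniform(-1.0, 1.0)                # random bias
        h = 1.0 / (1.0 + np.exp(-(X @ w_i + b_i)))  # new node's output on the data
        beta_i = (e @ h) / (h @ h)                  # optimal weight for this node only
        e -= beta_i * h                             # training MSE never increases
        W.append(w_i); b.append(b_i); beta.append(beta_i)
    return np.array(W), np.array(b), np.array(beta)

def elm_predict(X, W, b, beta):
    """Evaluate the trained SLFN on new inputs."""
    H = 1.0 / (1.0 + np.exp(-(X @ W.T + b)))
    return H @ beta
```

Per the abstract, the NFTI-ELM modifies this output-weight update to account for node open fault and multiplicative node noise, and the NFTCI-ELM additionally re-tunes all previously trained output weights; those updates depend on the paper's fault model and are not reproduced here.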
Pages: 155171 - 155183
Page count: 13
Related Papers
50 records (items [21]-[30] shown)
  • [21] Fault Tolerance of Cloud Infrastructure with Machine Learning
    Kalaskar, Chetankumar
    Thangam, S.
    CYBERNETICS AND INFORMATION TECHNOLOGIES, 2023, 23 (04) : 26 - 50
  • [22] On misbehaviour and fault tolerance in machine learning systems
    Myllyaho, Lalli
    Raatikainen, Mikko
    Mannisto, Tomi
    Nurminen, Jukka K.
    Mikkonen, Tommi
    JOURNAL OF SYSTEMS AND SOFTWARE, 2022, 183
  • [23] A Parallel Incremental Learning Algorithm for Neural Networks with Fault Tolerance
    Bahi, Jacques M.
    Contassot-Vivier, Sylvain
    Sauget, Marc
    Vasseur, Aurelien
    HIGH PERFORMANCE COMPUTING FOR COMPUTATIONAL SCIENCE - VECPAR 2008, 2008, 5336 : 174 - +
  • [24] Incremental extreme learning machine based on deep feature embedded
    Zhang, Jian
    Ding, Shifei
    Zhang, Nan
    Shi, Zhongzhi
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2016, 7 (01) : 111 - 120
  • [25] Orthogonal incremental extreme learning machine for regression and multiclass classification
    Li, Ying
    NEURAL COMPUTING AND APPLICATIONS, 2016, 27 : 111 - 120
  • [26] Enhanced random search based incremental extreme learning machine
    Huang, Guang-Bin
    Chen, Lei
    NEUROCOMPUTING, 2008, 71 (16-18) : 3460 - 3468
  • [27] Incremental and Decremental Extreme Learning Machine Based on Generalized Inverse
    Jin, Bo
    Jing, Zhongliang
    Zhao, Haitao
    IEEE ACCESS, 2017, 5 : 20852 - 20865
  • [28] Incremental extreme learning machine with fully complex hidden nodes
    Huang, Guang-Bin
    Li, Ming-Bin
    Chen, Lei
    Siew, Chee-Kheong
    NEUROCOMPUTING, 2008, 71 (4-6) : 576 - 583
  • [29] Parallel Chaos Search Based Incremental Extreme Learning Machine
    Yang, Yimin
    Wang, Yaonan
    Yuan, Xiaofang
    NEURAL PROCESSING LETTERS, 2013, 37 : 277 - 301