Back-propagation extreme learning machine

Cited by: 0
Authors
Weidong Zou
Yuanqing Xia
Weipeng Cao
Affiliations
[1] School of Automation, Beijing Institute of Technology
[2] College of Computer Science and Software Engineering, Shenzhen University
Source
Soft Computing | 2022, Vol. 26
关键词
I-ELM; Input parameters; Residual error; Convergence rate; Generalization performance; Stability;
DOI: Not available
Abstract
Incremental Extreme Learning Machine (I-ELM) is a typical constructive feed-forward neural network with random hidden nodes that can automatically determine the appropriate number of hidden nodes. However, I-ELM and its variants suffer from a notorious problem: their input parameters are randomly assigned and kept fixed throughout training, which leads to highly unstable model performance. To solve this problem, we propose a novel Back-Propagation ELM (BP-ELM) that dynamically assigns the most appropriate input parameters according to the current residual error of the model as hidden nodes are added. In this way, BP-ELM greatly improves the quality of newly added nodes, thereby accelerating convergence and improving model performance. Moreover, at the same error level, the network structure obtained by BP-ELM is more compact than that of I-ELM. We also prove the universal approximation ability of BP-ELM. Experimental results on three benchmark regression problems and a real-life traffic flow prediction problem show empirically that BP-ELM has better stability and generalization ability than other I-ELM-based algorithms.
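As a rough illustration of the incremental idea described in the abstract, the sketch below grows a single-hidden-layer network one node at a time: each new node's input weights and bias are first drawn randomly (as in I-ELM) and then refined with a few plain gradient steps against the current residual before the node is frozen. This is a minimal sketch under stated assumptions, not the authors' exact BP-ELM; the sigmoid activation, learning rate, number of refinement steps, stopping rule, and all names (grow_network, predict) are illustrative choices.

    # Minimal incremental-ELM-style sketch (illustrative, not the paper's BP-ELM):
    # each new hidden node is fitted to the CURRENT residual error, then frozen.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def grow_network(X, y, max_nodes=50, bp_steps=20, lr=0.1, tol=1e-3, seed=0):
        """X: (n, d) inputs, y: (n,) targets. Returns a list of (w, b, beta) nodes."""
        rng = np.random.default_rng(seed)
        n, d = X.shape
        residual = y.astype(float).copy()       # current residual error
        nodes = []
        for _ in range(max_nodes):
            # 1) random initialisation of the new node's input parameters (I-ELM style)
            w = rng.normal(size=d)
            b = rng.normal()
            # 2) refine (w, b) by gradient descent on || residual - beta * g(Xw + b) ||^2,
            #    recomputing the closed-form output weight beta at every step
            for _ in range(bp_steps):
                h = sigmoid(X @ w + b)                       # node output
                beta = (h @ residual) / (h @ h + 1e-12)      # optimal output weight
                err = residual - beta * h
                grad_z = -2.0 * beta * err * h * (1.0 - h)   # d loss / d (Xw + b)
                w -= lr * (X.T @ grad_z) / n
                b -= lr * grad_z.mean()
            # 3) freeze the node, update the residual, check convergence
            h = sigmoid(X @ w + b)
            beta = (h @ residual) / (h @ h + 1e-12)
            residual = residual - beta * h
            nodes.append((w, b, beta))
            if np.linalg.norm(residual) / np.sqrt(n) < tol:
                break
        return nodes

    def predict(nodes, X):
        return sum(beta * sigmoid(X @ w + b) for w, b, beta in nodes)

    # Example usage on synthetic data: fit y = sin(x)
    X = np.linspace(-3, 3, 200).reshape(-1, 1)
    y = np.sin(X).ravel()
    model = grow_network(X, y, max_nodes=30)
    y_hat = predict(model, X)

Refitting each node's input parameters to the residual (rather than keeping the random draw fixed) is what distinguishes the approach from plain I-ELM; the paper's formulation and convergence analysis should be consulted for the actual update rule.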
Pages: 9179–9188 (9 pages)
Related papers (50 in total)
  • [1] Back-propagation extreme learning machine
    Zou, Weidong; Xia, Yuanqing; Cao, Weipeng
    Soft Computing, 2022, 26(18): 9179-9188
  • [2] Privacy-preserving back-propagation and extreme learning machine algorithms
    Samet, Saeed; Miri, Ali
    Data & Knowledge Engineering, 2012, 79-80: 40-61
  • [3] Awesome back-propagation machine learning paradigm
    Badr, Assem
    Neural Computing and Applications, 2021, 33(20): 13225-13249
  • [4] Convolutional Neural Network Features Comparison Between Back-Propagation and Extreme Learning Machine
    Khellal, Atmane; Ma, Hongbin; Fei, Qing
    2018 37th Chinese Control Conference (CCC), 2018: 9629-9634
  • [5] Back Propagation Convex Extreme Learning Machine
    Zou, Weidong; Yao, Fenxi; Zhang, Baihai; Guan, Zixiao
    Proceedings of ELM-2016, 2018, 9: 259-272
  • [6] Model Reference Adaptive Neural Control for Nonlinear Systems Based on Back-Propagation and Extreme Learning Machine
    Rong, Hai-Jun; Bao, Rong-Jing; Zhao, Guang-She
    2014 IEEE Ninth International Conference on Intelligent Sensors, Sensor Networks and Information Processing (IEEE ISSNIP 2014), 2014
  • [7] Accelerated learning in back-propagation nets
    Schmidhuber, J.
    Connectionism in Perspective, 1989: 439-445
  • [8] Back-propagation learning in expert networks
    Lacher, R. C.; Hruska, S. I.; Kuncicky, D. C.
    IEEE Transactions on Neural Networks, 1992, 3(1): 62-72
  • [9] Implementing recurrent back-propagation on the Connection Machine
    Deprit, E.
    Neural Networks, 1989, 2(4): 295-314