A fast learning method for feedforward neural networks

Cited by: 17
Authors
Wang, Shitong [1 ,2 ]
Chung, Fu-Lai [2 ]
Wang, Jun [1 ]
Wu, Jun [1 ]
Affiliations
[1] Jiangnan Univ, Sch Digital Media, Wuxi, Jiangsu, Peoples R China
[2] Hong Kong Polytech Univ, Dept Comp, Hong Kong, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Fast learning method; Feedforward neural network; Extreme learning machine; Hidden-feature-space ridge regression; MACHINE; APPROXIMATION; REGRESSION;
DOI
10.1016/j.neucom.2014.01.065
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In order to circumvent the very slow convergence of most traditional learning algorithms for single-hidden-layer feedforward neural networks (SLFN), the extreme learning machine (ELM) was recently developed to achieve extremely fast learning with good performance by training only the output weights. However, it cannot be applied to multiple-hidden-layer feedforward neural networks (MLFN), which is a challenging bottleneck of ELM. In this work, a novel fast learning method (FLM) for feedforward neural networks is proposed. Firstly, based on existing ridge regression theories, the hidden-feature-space ridge regression (HFSR) and the centered ridge regression (Centered-ELM) are presented, and their connection with ELM is theoretically revealed. As special kernel methods, they can inherently be used to propagate the prominent advantages of ELM into MLFN. Then, the fast learning method FLM is proposed as a unified framework for HFSR and Centered-ELM. FLM can be applied to both SLFN and MLFN with single or multiple outputs. In FLM, only the parameters in the last hidden layer need to be adjusted, while all the parameters in the other hidden layers can be randomly assigned. The proposed FLM was tested against state-of-the-art methods on real-world datasets and provides better and more reliable results. (C) 2014 Elsevier B.V. All rights reserved.
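To make the core idea concrete, below is a minimal Python/NumPy sketch of an ELM-style fit in the spirit described by the abstract: the hidden-layer parameters are randomly assigned and only the final linear layer is solved in closed form by ridge regression. This is only an illustration of that ridge-regression step, not the authors' exact FLM/HFSR algorithm; the function names, activation choice, and default parameters are hypothetical.

    import numpy as np

    def elm_style_fit(X, Y, n_hidden=200, ridge=1e-3, seed=None):
        # Illustrative ELM-style training: the hidden layer is random and fixed,
        # only the output weights are obtained by closed-form ridge regression.
        # (Sketch only; not the paper's FLM/HFSR implementation.)
        rng = np.random.default_rng(seed)
        n_features = X.shape[1]
        W = rng.standard_normal((n_features, n_hidden))   # random input-to-hidden weights
        b = rng.standard_normal(n_hidden)                  # random hidden biases
        H = np.tanh(X @ W + b)                             # hidden feature matrix
        # Output weights: beta = (H^T H + ridge * I)^{-1} H^T Y
        beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ Y)
        return W, b, beta

    def elm_style_predict(model, X):
        W, b, beta = model
        return np.tanh(X @ W + b) @ beta

A multiple-hidden-layer variant in the spirit of FLM would stack several randomly assigned layers before the same closed-form solve for the last layer's parameters.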
Pages: 295-307
Number of pages: 13