Accelerating the training of feedforward neural networks using generalized Hebbian rules for initializing the internal representations

Cited: 17
Author
Karayiannis, NB
Institution
[1] Department of Electrical and Computer Engineering, University of Houston
DOI
10.1109/72.485677
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification
081104; 0812; 0835; 1405
Abstract
This paper presents an unsupervised learning scheme for initializing the internal representations of feedforward neural networks, which accelerates the convergence of supervised learning algorithms. The initial set of internal representations is formed through a bottom-up unsupervised learning process applied before the top-down supervised training algorithm. The synaptic weights connecting the input of the network to the hidden units are determined through linear or nonlinear variations of a generalized Hebbian learning rule known as Oja's rule. Various generalized Hebbian rules were experimentally tested and evaluated in terms of their effect on the convergence of the supervised training process. Several experiments indicated that the proposed initialization of the internal representations significantly improves the convergence of gradient-descent-based algorithms on nontrivial training tasks, and the improvement becomes more significant as the size and complexity of the training task increase.
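The abstract's core idea, unsupervised Hebbian pretraining of the input-to-hidden weights before supervised training, can be illustrated with a minimal sketch. This is not the paper's exact procedure: it uses Sanger's generalization of Oja's rule for multiple hidden units, and the learning rate, epoch count, and data below are illustrative assumptions.

```python
import numpy as np

def oja_init(X, n_hidden, lr=0.01, epochs=50, seed=0):
    """Initialize an (n_inputs, n_hidden) weight matrix with Sanger's
    generalization of Oja's rule (an unsupervised Hebbian update).

    Illustrative sketch only; hyperparameters are not from the paper."""
    rng = np.random.default_rng(seed)
    n_inputs = X.shape[1]
    W = rng.normal(scale=0.1, size=(n_hidden, n_inputs))
    for _ in range(epochs):
        for x in X:
            y = W @ x  # hidden-unit activations for this sample
            # Hebbian term y*x^T minus a decay term that keeps the
            # weight vectors bounded and mutually decorrelated
            W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W.T

# Usage: pretrain on centered inputs, then hand W to supervised training
# of the hidden layer. The data here has one dominant variance direction.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5)) * np.array([3.0, 2.0, 1.0, 0.5, 0.1])
X -= X.mean(axis=0)
W = oja_init(X, n_hidden=2)
print(np.linalg.norm(W, axis=0))  # column norms converge toward 1
```

Each weight vector converges toward a leading principal direction of the input data with unit norm, which is why such initializations can give the supervised phase a better-conditioned starting point than random weights.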
Pages: 419 - 426 (8 pages)
Related Papers (50 total)
  • [31] A novel conjugate gradient method with generalized Armijo search for efficient training of feedforward neural networks
    Wang, Jian
    Zhang, Bingjie
    Sun, Zhanquan
    Hao, Wenxue
    Sun, Qingying
    [J]. NEUROCOMPUTING, 2018, 275 : 308 - 316
  • [32] Generalized analytic rule extraction for feedforward neural networks
    Gupta, A
    Park, S
    Lam, SM
    [J]. IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 1999, 11 (06) : 985 - 991
  • [33] Generalized analytic rule extraction for feedforward neural networks
    Andersen Consulting, 3773 Willow Road, Northbrook, IL 60062, United States
    [J]. IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 1999, 11 (06) : 985 - 991
  • [34] A FAST NEW ALGORITHM FOR TRAINING FEEDFORWARD NEURAL NETWORKS
    SCALERO, RS
    TEPEDELENLIOGLU, N
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 1992, 40 (01) : 202 - 210
  • [35] A NOVEL FAST FEEDFORWARD NEURAL NETWORKS TRAINING ALGORITHM
    Bilski, Jaroslaw
    Kowalczyk, Bartosz
    Marjanski, Andrzej
    Gandor, Michal
    Zurada, Jacek
    [J]. JOURNAL OF ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING RESEARCH, 2021, 11 (04) : 287 - 306
  • [36] The adaptive fuzzy training algorithm for feedforward neural networks
    Xie, P.
    Liu, B.
    [J]. Xi Tong Gong Cheng Yu Dian Zi Ji Shu/Systems Engineering and Electronics, 2001, 23 (07) : 79 - 82
  • [37] Successive approximation training algorithm for feedforward neural networks
    Liang, YC
    Feng, DP
    Lee, HP
    Lim, SP
    Lee, KH
    [J]. NEUROCOMPUTING, 2002, 42 : 311 - 322
  • [38] A constructive algorithm for feedforward neural networks with incremental training
    Liu, DR
    Chang, TS
    Zhang, Y
    [J]. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS, 2002, 49 (12) : 1876 - 1879
  • [39] Evolutional design and training algorithm for feedforward neural networks
    Takahashi, H
    Nakajima, M
    [J]. IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 1999, E82D (10) : 1384 - 1392
  • [40] A global optimization algorithm for training feedforward neural networks
    Li, Huanqin
    [J]. DYNAMICS OF CONTINUOUS DISCRETE AND IMPULSIVE SYSTEMS-SERIES B-APPLICATIONS & ALGORITHMS, 2006, 13 : 846 - 849