A conjugate gradient learning algorithm for recurrent neural networks

Cited by: 15
Authors:
Chang, WF [1]
Mak, MW [1]
Affiliation:
[1] Hong Kong Polytech Univ, Dept Elect & Informat Engn, Hunghom, Peoples R China
Keywords:
recurrent neural networks; real-time recurrent learning; conjugate gradient
DOI:
10.1016/S0925-2312(98)00104-0
Chinese Library Classification (CLC):
TP18 [Artificial intelligence theory]
Discipline codes:
081104; 0812; 0835; 1405
Abstract
The real-time recurrent learning (RTRL) algorithm, originally proposed for training recurrent neural networks, requires a large number of iterations to converge because a small learning rate must be used. The obvious remedy, using a large learning rate, can result in undesirable convergence characteristics. This paper attempts to improve the convergence speed and convergence characteristics of the RTRL algorithm by incorporating conjugate gradient computation into its learning procedure. The resulting algorithm, referred to as the conjugate gradient recurrent learning (CGRL) algorithm, is applied to train fully connected recurrent neural networks to simulate a second-order low-pass filter and to predict the chaotic intensity pulsations of an NH3 laser. Results show that the CGRL algorithm exhibits substantial improvement in convergence (in terms of the reduction in mean squared error per epoch) as compared to the RTRL and batch-mode RTRL algorithms. (C) 1999 Elsevier Science B.V. All rights reserved.
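The core idea the abstract describes, replacing a fixed-learning-rate gradient step with a conjugate-gradient direction update plus a line search, can be sketched generically. A minimal illustration in Python: the Polak-Ribiere beta formula and the backtracking line search are assumptions for illustration (the abstract does not specify which CG variant the paper uses), and a toy quadratic loss stands in for the RTRL gradient of a recurrent network.

```python
import numpy as np

def cg_train(loss_fn, grad_fn, w, epochs=200, tol=1e-10):
    """Conjugate-gradient descent with a Polak-Ribiere direction update
    and a backtracking line search. In CGRL the gradient would come from
    RTRL applied to a recurrent network; here any differentiable loss works."""
    g = grad_fn(w)
    d = -g                                  # first direction: steepest descent
    for _ in range(epochs):
        if g @ g < tol:                     # gradient vanished: converged
            break
        # Backtracking line search: halve the step until the loss decreases.
        step, f0 = 1.0, loss_fn(w)
        while loss_fn(w + step * d) >= f0 and step > 1e-12:
            step *= 0.5
        w = w + step * d
        g_new = grad_fn(w)
        # Polak-Ribiere coefficient (clipped at 0, a common safeguard).
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d               # new conjugate direction
        g = g_new
    return w

# Toy usage: minimize (x - 3)^2 + (y + 1)^2 from the origin.
loss = lambda w: (w[0] - 3) ** 2 + (w[1] + 1) ** 2
grad = lambda w: np.array([2 * (w[0] - 3), 2 * (w[1] + 1)])
w_opt = cg_train(loss, grad, np.zeros(2))
```

Because successive directions are (approximately) conjugate, the method avoids the zig-zagging of plain steepest descent, which is the mechanism behind the per-epoch MSE reduction the abstract reports.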
Pages: 173-189
Page count: 17