Learning in fully recurrent neural networks by approaching tangent planes to constraint surfaces

Cited by: 7
|
Authors
May, P. [2 ]
Zhou, E. [1 ]
Lee, C. W. [1 ]
Affiliations
[1] Univ Bolton, Fac Adv Engn & Sci, Bolton BL3 5AB, England
[2] K Coll, Tonbridge TN9 2PW, Kent, England
Keywords
Real time recurrent learning; Accelerated; Local minimum; Speed; Temporal pattern recognition; Henon map; Non-linear process plant; Algorithm
DOI
10.1016/j.neunet.2012.06.011
CLC number
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
In this paper we present a new variant of the online real-time recurrent learning algorithm proposed by Williams and Zipser (1989). Whilst the original algorithm utilises gradient information to guide the search towards the minimum training error, it is very slow in most applications and often gets stuck in local minima of the search space. It is also sensitive to the choice of learning rate and requires careful tuning. The new variant adjusts weights by moving them onto the tangent planes to constraint surfaces. It is simple to implement and requires no parameters to be set manually. Experimental results show that this new algorithm gives significantly faster convergence whilst avoiding problems such as local minima. (C) 2012 Elsevier Ltd. All rights reserved.
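The abstract's core idea can be illustrated with a minimal sketch (this is an assumption-based illustration of the general tangent-plane principle, not the paper's exact recurrent-network algorithm): for a single training pattern with target t, the constraint surface is the set of weight vectors w for which the network output y(w) equals t. Projecting the current weights onto the tangent plane of that surface yields the parameter-free update w ← w − e·g / ‖g‖², where e = y(w) − t and g = ∂y/∂w, so no learning rate is required. For a linear unit, one such step lands exactly on the constraint surface:

```python
import numpy as np

def tangent_plane_step(w, x, t):
    """One tangent-plane update for a linear unit y = w.x.

    The constraint surface is {w : w.x = t}; moving to its tangent
    plane (here, the surface itself, since it is affine) gives the
    learning-rate-free update w <- w - e * g / ||g||^2.
    """
    e = w @ x - t          # output error for this pattern
    g = x                  # gradient of the output w.r.t. w
    return w - e * g / (g @ g)

rng = np.random.default_rng(0)
w = rng.standard_normal(3)   # initial weights (hypothetical example data)
x = rng.standard_normal(3)   # input pattern
t = 1.5                      # target output

w_new = tangent_plane_step(w, x, t)
print(abs(w_new @ x - t))    # residual error after one step: ~0
```

For a genuinely non-linear recurrent network the constraint surface is curved, so the tangent-plane step only approximately reaches it and the update is iterated over the training sequence, as in the paper's online setting.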
Pages: 72-79
Page count: 8
Related papers
50 records
  • [41] Data Symmetries and Learning in Fully Connected Neural Networks
    Anselmi, Fabio
    Manzoni, Luca
    D'Onofrio, Alberto
    Rodriguez, Alex
    Caravagna, Giulio
    Bortolussi, Luca
    Cairoli, Francesca
    IEEE ACCESS, 2023, 11 : 47282 - 47290
  • [42] Learning algorithms and the shape of the learning surface in recurrent neural networks
    Watanabe, Tatsumi
    Uchikawa, Yoshiki
    Gouhara, Kazutoshi
    Systems and Computers in Japan, 1992, 23 (13): 90 - 107
  • [43] Training spatially homogeneous fully recurrent neural networks in eigenvalue space
    Perfetti, R
    Massarelli, E
    NEURAL NETWORKS, 1997, 10 (01) : 125 - 137
  • [44] Semantic Segmentation of RGBD Videos with Recurrent Fully Convolutional Neural Networks
    Yurdakul, Ekrem Emre
    Yemez, Yucel
    2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS (ICCVW 2017), 2017, : 367 - 374
  • [45] Behavioral modeling of power amplifiers using fully recurrent neural networks
    Luongvinh, D
    Kwon, Y
    2005 IEEE MTT-S International Microwave Symposium, Vols 1-4, 2005, : 1979 - 1982
  • [46] Bipolar fully recurrent deep structured neural learning based attack detection for securing industrial sensor networks
    Alzubi, Jafar A.
    TRANSACTIONS ON EMERGING TELECOMMUNICATIONS TECHNOLOGIES, 2021, 32 (07)
  • [47] On the improvement of the real time recurrent learning algorithm for recurrent neural networks
    Mak, MW
    Ku, KW
    Lu, YL
    NEUROCOMPUTING, 1999, 24 (1-3) : 13 - 36
  • [48] DeepSaDe: Learning Neural Networks That Guarantee Domain Constraint Satisfaction
    Goyal, Kshitij
    Dumancic, Sebastijan
    Blockeel, Hendrik
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 11, 2024, : 12199 - 12207
  • [49] Accelerating parallel tangent learning for neural networks through dynamic self adaptation
    Moallem, P
    Faez, K
    IEEE TENCON'97 - IEEE REGIONAL 10 ANNUAL CONFERENCE, PROCEEDINGS, VOLS 1 AND 2: SPEECH AND IMAGE TECHNOLOGIES FOR COMPUTING AND TELECOMMUNICATIONS, 1997, : 375 - 378
  • [50] On the Learning Capabilities of Recurrent Neural Networks: A Cryptographic Perspective
    Srivastava, Shivin
    Bhatia, Ashutosh
    2018 9TH IEEE INTERNATIONAL CONFERENCE ON BIG KNOWLEDGE (ICBK), 2018, : 162 - 167