An efficient learning algorithm with second-order convergence for multilayer neural networks

Cited: 0
Authors
Ninomiya, H. [1]
Tomita, C. [1]
Asai, H. [1]
Affiliation
[1] Shonan Inst Technol, Fac Engn, Dept Informat Sci, Fujisawa, Kanagawa 2518511, Japan
Keywords
(none listed)
DOI
(not available)
CLC Classification Number
TP18 [Artificial intelligence theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
This paper describes an efficient second-order algorithm, with wide and stable convergence properties, for training multilayer neural networks. First, an algorithm based on an iterative formula of the steepest descent (SD) method, employed "implicitly", is introduced. We show that this "implicit" steepest descent (ISD) method is equivalent to the Gauss-Newton (GN) method; the ISD method therefore meets the desired targets by combining the merits of the GN and SD techniques, enhancing the favorable properties of the SD method. Next, we propose a powerful algorithm for training multilayer feedforward neural networks, called the "implicit" steepest descent with momentum (ISDM) method, and show its analogy with the trapezoidal formula of numerical analysis. Finally, the proposed algorithms are compared with the GN method in computer simulations of multilayer neural network training.
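The abstract gives no formulas, but the SD/GN connection it describes can be illustrated on a toy least-squares problem. The sketch below (Python with NumPy) is an assumption-laden reconstruction, not the authors' code: the step size eta, the function names, and the toy linear model are illustrative choices. It shows, under standard definitions, how an implicit SD step and a trapezoidal-rule step both reduce to damped Gauss-Newton-style linear solves.

# A numerical sketch (assumption, not the paper's code) of the SD/GN
# relationship the abstract describes, on a toy linear least-squares
# problem E(w) = 0.5 * ||A w - b||^2.

import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(20, 3))   # toy design matrix
b = rng.normal(size=20)        # toy targets

def grad(w):
    # Gradient of E(w) = 0.5 * ||A w - b||^2.
    return A.T @ (A @ w - b)

H_gn = A.T @ A  # Gauss-Newton approximation J^T J (exact here: model is linear)

def explicit_sd_step(w, eta):
    # Ordinary (explicit) steepest descent: w+ = w - eta * grad(w).
    return w - eta * grad(w)

def implicit_sd_step(w, eta):
    # An "implicit" SD step evaluates the gradient at the NEW point,
    #     w+ = w - eta * grad(w+),
    # and linearizing grad around w turns this into the linear solve
    #     (H_gn + I/eta) (w+ - w) = -grad(w),
    # i.e. a Levenberg-Marquardt-damped Gauss-Newton step:
    # eta -> 0 recovers plain SD, eta -> infinity the pure GN step.
    dw = np.linalg.solve(H_gn + np.eye(len(w)) / eta, -grad(w))
    return w + dw

def trapezoidal_step(w, eta):
    # Trapezoidal rule for the gradient flow dw/dt = -grad(w):
    #     w+ = w - (eta/2) * (grad(w) + grad(w+)),
    # solved by the same linearization:
    #     (H_gn + (2/eta) I) (w+ - w) = -2 * grad(w).
    # The abstract relates its ISDM method to this formula.
    dw = np.linalg.solve(H_gn + 2.0 * np.eye(len(w)) / eta, -2.0 * grad(w))
    return w + dw

w = np.zeros(3)
for _ in range(5):
    w = implicit_sd_step(w, eta=10.0)
print("||grad|| after 5 implicit-SD steps:", np.linalg.norm(grad(w)))

Under this reading, the step size eta interpolates between plain SD (small eta) and the pure GN step (large eta), which matches the abstract's claim that the ISD method combines the merits of both techniques.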
Pages: 2028-2032
Page count: 5