Fast Convergent Generalized Back-Propagation Algorithm with Constant Learning Rate

Cited by: 0
Authors
S.C. Ng
S.H. Leung
A. Luk
Affiliations
[1] Hong Kong Technical College,Department of Computing and Mathematics
[2] City University of Hong Kong,Department of Electronic Engineering
Source
Neural Processing Letters, 1999, 9(1): 13-23
Keywords
generalized back-propagation; gradient descent algorithm; feedforward neural networks; convergence; constant learning rate;
DOI: not available
Abstract
The conventional back-propagation algorithm is basically a gradient-descent method; it suffers from slow convergence and a tendency to become trapped in local minima. This letter introduces a new generalized back-propagation algorithm that effectively speeds up the convergence rate and reduces the chance of being trapped in local minima. The idea is to change the derivative of the activation function so as to magnify the backward-propagated error signal; the convergence rate is thereby accelerated and local minima can be escaped. We also investigate the convergence of the generalized back-propagation algorithm with a constant learning rate. The weight sequences generated by the algorithm can be approximated by a certain ordinary differential equation (ODE); as the learning rate tends to zero, the interpolated weight sequences of generalized back-propagation converge weakly to the solution of the associated ODE.
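The abstract's core idea — magnifying the backward-propagated error by modifying the derivative of the activation function — can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: the specific magnification rule from the paper is not reproduced here, so the sketch assumes a simple constant offset `c` added to the sigmoid derivative (in the spirit of flat-spot elimination), which keeps the error signal nonzero even where the standard derivative vanishes near saturation. All names (`generalized_deriv`, `train_xor`, the constant `c`) are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def generalized_deriv(a, c=0.1):
    # Standard sigmoid derivative a*(1-a), plus an assumed constant c > 0.
    # The offset magnifies the backward-propagated error near saturation
    # (a close to 0 or 1), where a*(1-a) alone vanishes and standard
    # back-propagation stalls.
    return a * (1.0 - a) + c

def train_xor(c=0.1, eta=0.5, epochs=2000, seed=0):
    """Train a 2-4-1 sigmoid network on XOR with constant learning rate eta,
    using the magnified derivative in the backward pass."""
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = rng.normal(0.0, 0.5, (2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(0.0, 0.5, (4, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        y = sigmoid(h @ W2 + b2)
        # Backward pass: deltas use the magnified derivative.
        d2 = (y - T) * generalized_deriv(y, c)
        d1 = (d2 @ W2.T) * generalized_deriv(h, c)
        # Gradient-descent weight update with constant learning rate.
        W2 -= eta * h.T @ d2; b2 -= eta * d2.sum(axis=0)
        W1 -= eta * X.T @ d1; b1 -= eta * d1.sum(axis=0)
    return float(np.mean((y - T) ** 2))
```

Setting `c = 0` recovers standard back-propagation; the paper's convergence analysis concerns the constant-`eta` regime, where the weight iterates track the solution of an associated ODE as `eta` tends to zero.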
Pages: 13-23 (10 pages)
Related Papers (50 entries)
  • [1] Fast convergent generalized back-propagation algorithm with constant learning rate
    Ng, SC
    Leung, SH
    Luk, A
    NEURAL PROCESSING LETTERS, 1999, 9 (01) : 13 - 23
  • [2] Convergence of the generalized back-propagation algorithm with constant learning rates
    Ng, SC
    Leung, SH
    Luk, A
    IEEE WORLD CONGRESS ON COMPUTATIONAL INTELLIGENCE, 1998, : 1090 - 1094
  • [3] A complement to the back-propagation algorithm: An upper bound for the learning rate
    Cerqueira, JJF
    Palhares, AGB
    Madrid, MK
    IJCNN 2000: PROCEEDINGS OF THE IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOL IV, 2000, : 517 - 522
  • [4] The generalized back-propagation algorithm with convergence analysis
    Ng, SC
    Leung, SH
    Luk, A
    ISCAS '99: PROCEEDINGS OF THE 1999 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, VOL 5: SYSTEMS, POWER ELECTRONICS, AND NEURAL NETWORKS, 1999, : 612 - +
  • [5] A generalized back-propagation algorithm for faster convergence
    Ng, SC
    Leung, SH
    Luk, A
    ICNN - 1996 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS. 1-4, 1996, : 409 - 413
  • [6] Error measures of the back-propagation learning algorithm
    Fujiki, S
    Nakao, M
    Fujiki, NM
    JOURNAL OF THE KOREAN PHYSICAL SOCIETY, 2002, 40 (06) : 1091 - 1095
  • [7] AN ACCELERATED ERROR BACK-PROPAGATION LEARNING ALGORITHM
    MAKRAMEBEID, S
    SIRAT, JA
    VIALA, JR
    PHILIPS JOURNAL OF RESEARCH, 1990, 44 (06) : 521 - 540
  • [8] Generalized back-propagation algorithm with weight evolution for neural networks
    Ng, SC
    Leung, SH
    Luk, A
    6TH WORLD MULTICONFERENCE ON SYSTEMICS, CYBERNETICS AND INFORMATICS, VOL XX, PROCEEDINGS EXTENSION, 2002, : 41 - 44
  • [9] A Novel Learning Algorithm of Back-propagation Neural Network
    Gong, Bing
    2009 IITA INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND SYSTEMS ENGINEERING, PROCEEDINGS, 2009, : 411 - 414
  • [10] Research the conditions of convergence of back-propagation learning algorithm
    Jeenbekov, AA
    Sarybaeva, AA
    OPTOELECTRONIC AND HYBRID OPTICAL/DIGITAL SYSTEMS FOR IMAGE AND SIGNAL PROCESSING, 2000, 4148 : 12 - 18