A modified error function to improve the error back-propagation algorithm for multi-layer perceptrons

Cited by: 2
Authors
Oh, Sang-Hoon [1 ]
Lee, Youngjik [2 ]
Affiliations
[1] Pusan National University, Pusan, Korea, Republic of
[2] Seoul National University, Seoul, Korea, Republic of
Keywords
Errors; Learning algorithms; Mean square error; Classification (of information); Character recognition
DOI
10.4218/etrij.95.0195.0012
Abstract
This paper proposes a modified error function to improve the error back-propagation (EBP) algorithm for multi-layer perceptrons (MLPs), which suffers from slow learning speed. It can also suppress the overspecialization for training patterns that occurs in an algorithm based on a cross-entropy cost function, which markedly reduces learning time. Like the cross-entropy function, our new function accelerates the learning speed of the EBP algorithm by allowing an output node of the MLP to generate a strong error signal when the output is far from the desired value. Moreover, it prevents overspecialization of learning for training patterns by letting an output node whose value is close to the desired value generate a weak error signal. In a simulation study classifying handwritten digits in the CEDAR [1] database, the proposed method attained 100% correct classification of the training patterns after only 50 sweeps of learning, while the original EBP attained only 98.8% after 500 sweeps. Our method also achieves a mean-squared error of 0.627 on the test patterns, which is lower than the 0.667 obtained with the cross-entropy method. These results demonstrate that the new method surpasses the others in learning speed as well as in generalization.
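The abstract's argument can be made concrete by comparing the three output-node error signals it contrasts. The Python sketch below is illustrative only: mse_delta and cross_entropy_delta follow the standard forms for a sigmoid output node, while modified_delta is a hypothetical odd-power stand-in chosen to reproduce the qualitative behavior described (strong signal far from the target, weak signal near it); it is not the paper's published error function.

# Illustrative comparison of output-node error signals for a sigmoid unit
# with target t. modified_delta is a hypothetical stand-in, NOT the exact
# function proposed in the paper.
import numpy as np

def mse_delta(y, t):
    # Standard EBP (MSE) signal: the sigmoid derivative y(1 - y) shrinks
    # as y saturates, which is the cause of slow learning.
    return (t - y) * y * (1.0 - y)

def cross_entropy_delta(y, t):
    # Cross-entropy signal: the derivative term cancels, so the signal
    # decays only linearly as y approaches t, which can overspecialize
    # the network to the training patterns.
    return t - y

def modified_delta(y, t, n=3):
    # Hypothetical modified signal ~ (t - y)^n with odd n: strong when
    # |t - y| is large, decaying rapidly as y approaches t.
    return (t - y) ** n

y = np.linspace(0.01, 0.99, 5)   # output activations from near 0 to near 1
t = 1.0                          # desired value
for name, f in [("MSE", mse_delta),
                ("cross-entropy", cross_entropy_delta),
                ("modified", modified_delta)]:
    print(f"{name:14s}", np.round(f(y, t), 4))

Running it shows the MSE signal being suppressed by the y(1 - y) factor even when the output saturates at the wrong extreme (the slow-learning problem), the cross-entropy signal decaying only linearly as y approaches t, and the stand-in decaying much faster near the target, mirroring the weak-signal behavior the abstract attributes to the proposed function.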
Pages: 11-22