A modified error function to improve the error back-propagation algorithm for multi-layer perceptrons

Cited by: 2
Authors
Oh, Sang-Hoon [1 ]
Lee, Youngjik [2 ]
Affiliations
[1] Pusan National University, Pusan, Korea, Republic of
[2] Seoul National University, Seoul, Korea, Republic of
Keywords
Errors; Learning algorithms; Mean square error; Classification (of information); Character recognition
DOI
10.4218/etrij.95.0195.0012
Abstract
This paper proposes a modified error function to improve the error back-propagation (EBP) algorithm for multi-layer perceptrons (MLPs), which suffers from slow learning speed. It can also suppress the overspecialization for training patterns that occurs in algorithms based on a cross-entropy cost function, which markedly reduces learning time. Like the cross-entropy function, the new function accelerates the EBP algorithm by letting an output node of the MLP generate a strong error signal when its output is far from the desired value. Moreover, it prevents overspecialization on training patterns by letting an output node whose value is close to the desired value generate a weak error signal. In a simulation study classifying handwritten digits from the CEDAR [1] database, the proposed method attained 100% correct classification of the training patterns after only 50 sweeps of learning, while the original EBP attained only 98.8% after 500 sweeps. The proposed method also yields a mean-squared error of 0.627 on the test patterns, superior to the 0.667 of the cross-entropy method. These results demonstrate that the new method outperforms the others in learning speed as well as in generalization.
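The strong-signal/weak-signal behavior the abstract describes can be made concrete by comparing output-node error signals for a sigmoid unit. The sketch below is purely illustrative: the `modified_delta` function is a hypothetical stand-in for the paper's actual error function (which the abstract does not specify), chosen only to exhibit the stated behavior of staying strong for large errors while being suppressed for small ones.

```python
# Illustrative error signals at a sigmoid output node with output y and
# target t. Only mse_delta and cross_entropy_delta are standard results;
# modified_delta is a hypothetical illustration, NOT the paper's function.

def mse_delta(y, t):
    # Standard EBP (squared-error cost): the y*(1-y) factor makes the
    # signal vanish when y saturates near 0 or 1, even if the error is large.
    return (y - t) * y * (1.0 - y)

def cross_entropy_delta(y, t):
    # Cross-entropy cost: the signal stays proportional to the raw error,
    # so learning is fast but small residual errors are still pushed on,
    # which can overspecialize the network to the training patterns.
    return y - t

def modified_delta(y, t):
    # Hypothetical illustration: cubing keeps the signal strong when the
    # output is far from the target but suppresses it when the output is
    # already close, as the abstract describes.
    return (y - t) ** 3

far, close = 0.95, 0.05  # outputs versus a target of 0
for name, f in [("MSE", mse_delta),
                ("cross-entropy", cross_entropy_delta),
                ("modified", modified_delta)]:
    print(f"{name:14s} far={f(far, 0.0):+.4f}  close={f(close, 0.0):+.6f}")
```

Running the comparison shows the intended pattern: for the far output the illustrative modified signal is much stronger than the saturated MSE signal, while for the near-correct output it is much weaker than the cross-entropy signal.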
Pages: 11 - 22
Related papers (50 in total)
  • [21] A local supervised learning algorithm for multi-layer perceptrons
    Vlachos, DS
    ICNAAM 2004: INTERNATIONAL CONFERENCE ON NUMERICAL ANALYSIS AND APPLIED MATHEMATICS 2004, 2004, : 452 - 454
  • [22] An approximate algorithm for reverse engineering of multi-layer perceptrons
    Kowalczyk, W
    RESEARCH AND DEVELOPMENT IN INTELLIGENT SYSTEMS XX, 2004, : 55 - 66
  • [23] Frequency-based error back-propagation in a cortical network
    Bogacz, R
    Brown, MW
    Giraud-Carrier, C
    IJCNN 2000: PROCEEDINGS OF THE IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOL II, 2000, : 211 - 216
  • [24] Classification of Premature Ventricular Contraction using Error Back-Propagation
    Jeon, Eunkwang
    Jung, Bong-Keun
    Nam, Yunyoung
    Lee, HwaMin
    KSII TRANSACTIONS ON INTERNET AND INFORMATION SYSTEMS, 2018, 12 (02): : 988 - 1001
  • [25] Back Propagation Algorithm: The Best Algorithm Among the Multi-layer Perceptron Algorithm
    Alsmadi, Mutasem Khalil Sari
    Bin Omar, Khairuddin
    Noah, Shahrul Azman
    INTERNATIONAL JOURNAL OF COMPUTER SCIENCE AND NETWORK SECURITY, 2009, 9 (04): : 378 - 383
  • [26] Driving habits analysis on vehicle data using error back-propagation neural network algorithm
    Zhang, Wen Jing
    Yu, Shu Xi
    Peng, Yun Feng
    Cheng, Zi Jing
    Wang, Chong
    COMPUTING, CONTROL, INFORMATION AND EDUCATION ENGINEERING, 2015, : 55 - 58
  • [27] Temporal correction of multiple neuronal spike trains using the back-propagation error correction algorithm
    Tam, D.C.
    Perkel, D.H.
    Tucker, W.S.
    Neural Networks, 1988, 1 (1 SUPPL)
  • [28] Solving flat-spot problem in back-propagation learning algorithm based on magnified error
    Yang, B
    Wang, YD
    Su, XH
    Wang, LJ
    PROCEEDINGS OF THE 2004 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7, 2004, : 1784 - 1788
  • [29] Improving the error backpropagation algorithm with a modified error function
    Oh, SH
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1997, 8 (03): : 799 - 803