A modified error function to improve the error back-propagation algorithm for multi-layer perceptrons

Cited by: 2
Authors
Oh, Sang-Hoon [1 ]
Lee, Youngjik [2 ]
Affiliations
[1] Pusan National University, Pusan, Korea, Republic of
[2] Seoul National University, Seoul, Korea, Republic of
Keywords
Errors; Learning algorithms; Mean square error; Classification (of information); Character recognition
DOI
10.4218/etrij.95.0195.0012
Abstract
This paper proposes a modified error function to improve the error back-propagation (EBP) algorithm for multi-layer perceptrons (MLPs), which suffers from slow learning speed. It can also suppress overspecialization for training patterns, a problem that occurs in algorithms based on a cross-entropy cost function, which markedly reduces learning time. Like the cross-entropy function, our new function accelerates the learning speed of the EBP algorithm by allowing an output node of the MLP to generate a strong error signal when the output is far from the desired value. Moreover, it prevents overspecialization to the training patterns by letting an output node whose value is close to the desired value generate a weak error signal. In a simulation study classifying handwritten digits from the CEDAR [1] database, the proposed method attained 100% correct classification of the training patterns after only 50 sweeps of learning, while the original EBP attained only 98.8% after 500 sweeps. Also, our method yields a mean-squared error of 0.627 on the test patterns, superior to the error of 0.667 obtained with the cross-entropy method. These results demonstrate that our new method excels in learning speed as well as in generalization.
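The abstract does not give the exact functional form of the modified error function; as a rough illustration of the behaviour it describes (a strong error signal when the output is far from the target, a weak one when it is close), the following Python sketch contrasts the output-node error signals of standard MSE-based EBP, the cross-entropy cost, and a hypothetical cubic variant. The cubic form is an assumption for illustration only, not the paper's actual function.

```python
def delta_mse(y, t):
    """Standard EBP (mean-squared error) signal for a sigmoid output y
    and target t. The y*(1-y) factor vanishes when the sigmoid
    saturates, even if y is far from t -> slow learning."""
    return (y - t) * y * (1.0 - y)

def delta_cross_entropy(y, t):
    """Cross-entropy cost signal: linear in the error, so a distant
    output always receives a strong push -> fast learning, but outputs
    already near the target keep being pushed (overspecialization)."""
    return y - t

def delta_modified(y, t):
    """Hypothetical modified signal: cubic in the error, so it stays
    strong for large errors but decays quickly once y is close to t."""
    return (y - t) ** 3

# A badly wrong, saturated output (y = 0.99, target 0):
# MSE gives a near-zero signal, the other two stay strong.
print(delta_mse(0.99, 0.0), delta_cross_entropy(0.99, 0.0), delta_modified(0.99, 0.0))

# A nearly correct output (y = 0.05, target 0):
# the cubic signal is far weaker than cross-entropy's.
print(delta_modified(0.05, 0.0), delta_cross_entropy(0.05, 0.0))
```

This contrast mirrors the trade-off the abstract claims to resolve: fast learning on distant outputs without continued over-training of already-correct ones.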
Pages: 11 - 22
Related Papers
50 records in total
  • [31] An improved dynamic sampling back-propagation algorithm based on mean square error to face the multi-class imbalance problem
    R. Alejo
    J. Monroy-de-Jesús
    J. C. Ambriz-Polo
    J. H. Pacheco-Sánchez
    Neural Computing and Applications, 2017, 28 : 2843 - 2857
  • [32] Dynamic Learning Algorithm of Multi-Layer Perceptrons for Letter Recognition
    Feng, Qin
    Gao Daqi
    2013 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2013
  • [34] Preorganized neural networks: Error back-propagation learning of manipulator dynamics
    Tsuji, Toshio
    Ablex Publ Corp, Norwood, NJ, United States (02): 1 - 2
  • [35] Areas where error back-propagation and Kohonen networks touch.
    Zupan, J
    ABSTRACTS OF PAPERS OF THE AMERICAN CHEMICAL SOCIETY, 1997, 214 : 27 - CINF
  • [36] Can supervised learning be achieved without explicit error back-propagation?
    Brandt, RD
    Lin, F
    ICNN - 1996 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS. 1-4, 1996, : 300 - 305
  • [37] Effect of the Phase-Error of Local Oscillators in Digital Back-Propagation
    Park, Sang-Gyu
    IEEE PHOTONICS TECHNOLOGY LETTERS, 2015, 27 (04) : 363 - 366
  • [38] A ROBUST BACK-PROPAGATION LEARNING ALGORITHM FOR FUNCTION APPROXIMATION
    CHEN, DS
    JAIN, RC
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1994, 5 (03): 467 - 479
  • [39] Study on the Application of Error Back-Propagation Algorithm Applied to the Student Status Management in Higher Education Institutions
    Yang, Xinxiu
    INTERNATIONAL JOURNAL OF INFORMATION AND COMMUNICATION TECHNOLOGY EDUCATION, 2024, 20 (01)
  • [40] A novel prediction algorithm of DR position error based on bayesian regularization back-propagation neural network
    Ju, T.
    Universitas Ahmad Dahlan, Yogyakarta, Indonesia (11)