A modified error function to improve the error back-propagation algorithm for multi-layer perceptrons

Cited by: 2
Authors
Oh, Sang-Hoon [1 ]
Lee, Youngjik [2 ]
Affiliations
[1] Pusan National University, Pusan, Korea, Republic of
[2] Seoul National University, Seoul, Korea, Republic of
Keywords
Errors; Learning algorithms; Mean square error; Classification (of information); Character recognition
DOI
10.4218/etrij.95.0195.0012
Abstract
This paper proposes a modified error function to improve the error back-propagation (EBP) algorithm for multi-layer perceptrons (MLPs), which suffers from slow learning speed. The proposed function can also suppress the overspecialization for training patterns that occurs in an algorithm based on a cross-entropy cost function, which markedly reduces learning time. In a similar way to the cross-entropy function, our new function accelerates the learning speed of the EBP algorithm by allowing an output node of the MLP to generate a strong error signal when its output is far from the desired value. Moreover, it prevents overspecialization to training patterns by letting an output node whose value is close to the desired value generate a weak error signal. In a simulation study classifying handwritten digits from the CEDAR [1] database, the proposed method attained 100% correct classification of the training patterns after only 50 sweeps of learning, while the original EBP attained only 98.8% after 500 sweeps. Our method also achieved a mean-squared error of 0.627 on the test patterns, superior to the 0.667 of the cross-entropy method. These results demonstrate that the new method excels in both learning speed and generalization.
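The abstract's comparison of output-node error signals can be sketched numerically. Below is a minimal illustration, not the paper's actual function: `mse_delta` and `cross_entropy_delta` are the standard output-layer error signals for a logistic output node, while `modified_delta` is a hypothetical odd-power form chosen only to exhibit the qualitative behavior the abstract describes (strong signal far from the target, weak signal near it).

```python
import numpy as np

def mse_delta(t, o):
    """Output-node error signal for standard EBP with a mean-squared-error
    cost and logistic activation: the sigmoid-derivative factor o*(1-o)
    vanishes near 0 and 1, one cause of slow learning."""
    return (t - o) * o * (1.0 - o)

def cross_entropy_delta(t, o):
    """Error signal for the cross-entropy cost: the sigmoid derivative
    cancels, so nodes far from their targets receive strong corrections,
    but nodes already near their targets keep being pushed, which can
    overspecialize the network to the training patterns."""
    return t - o

def modified_delta(t, o, n=3):
    """Hypothetical sketch in the spirit of the paper: comparable to
    (t - o) when the output is far from the target, but decaying faster
    than (t - o) when the output is close. The odd power n is an
    illustrative assumption, not the paper's exact error function."""
    return (t - o) ** n

# Compare the three signals for a target of 1.0 at several outputs.
for o in (0.1, 0.5, 0.9):
    print(f"o={o}: mse={mse_delta(1.0, o):+.4f}, "
          f"xent={cross_entropy_delta(1.0, o):+.4f}, "
          f"modified={modified_delta(1.0, o):+.4f}")
```

For an output far from the target (o = 0.1, t = 1.0) the modified signal stays close to the cross-entropy signal, while for a near-correct output (o = 0.9) it is an order of magnitude weaker, matching the accelerate-then-relax behavior described in the abstract.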
Pages: 11 - 22
Related papers (50 total)
  • [1] Detection of EEG sleep spindles using back-propagation multi-layer perceptrons
    Ventouras, E
    Monoyiou, E
    Ktonas, PY
    Paparrigopoulos, T
    Dikeos, DG
    Uzunoglu, N
    Soldatos, CR
    MEDICON 2001: PROCEEDINGS OF THE INTERNATIONAL FEDERATION FOR MEDICAL & BIOLOGICAL ENGINEERING, PTS 1 AND 2, 2001, : 380 - 381
  • [2] Error measures of the back-propagation learning algorithm
    Fujiki, S
    Nakao, M
    Fujiki, NM
    JOURNAL OF THE KOREAN PHYSICAL SOCIETY, 2002, 40 (06) : 1091 - 1095
  • [3] AN ACCELERATED ERROR BACK-PROPAGATION LEARNING ALGORITHM
    MAKRAMEBEID, S
    SIRAT, JA
    VIALA, JR
    PHILIPS JOURNAL OF RESEARCH, 1990, 44 (06) : 521 - 540
  • [4] Compressive Learning of Multi-layer Perceptrons: An Error Analysis
    Kaban, Ata
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [5] IMPROVING BACK-PROPAGATION WITH A NEW ERROR FUNCTION
    HUMPERT, BK
    NEURAL NETWORKS, 1994, 7 (08) : 1191 - 1192
  • [6] Error back-propagation algorithm for classification of imbalanced data
    Oh, Sang-Hoon
    NEUROCOMPUTING, 2011, 74 (06) : 1058 - 1061
  • [7] Acceleration by prediction for error back-propagation algorithm of neural network
    Kanda, Arihiro
    Publ by Scripta Technica Inc, New York, NY, United States, (25): 1600
  • [8] Methods to speed up error back-propagation learning algorithm
    Sarkar, D
    ACM COMPUTING SURVEYS, 1995, 27 (04) : 519 - 542
  • [9] Theories of Error Back-Propagation in the Brain
    Whittington, James C. R.
    Bogacz, Rafal
    TRENDS IN COGNITIVE SCIENCES, 2019, 23 (03) : 235 - 250
  • [10] Error back-propagation in multi-valued logic systems
    Apostolikas, Georgios
    Konstantopoulos, Stasinos
    ICCIMA 2007: INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND MULTIMEDIA APPLICATIONS, VOL IV, PROCEEDINGS, 2007, : 207 - 213