IMPROVING GENERALIZATION PERFORMANCE USING DOUBLE BACKPROPAGATION

Cited by: 139
Authors
DRUCKER, H
LECUN, Y
Affiliations
[1] MONMOUTH COLL, LONG BRANCH, NJ 07764 USA
[2] AT&T BELL LABS, HOLMDEL, NJ 07733 USA
DOI
10.1109/72.165600
CLC number
TP18 [Artificial intelligence theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
In order to generalize from a training set to a test set, it is desirable that small changes in the input space of a pattern do not change the output components. This can be achieved by including variations of the input patterns in the training set, but doing so is computationally very expensive. Another method is to enforce this behavior as part of the training algorithm. Double backpropagation does this by forming an energy function that is the sum of the normal energy term found in backpropagation and an additional term that is a function of the Jacobian. Significant improvement is shown with different architectures and different test sets, especially with architectures that had previously been shown to perform very well when trained using backpropagation. It is also shown that double backpropagation, compared to backpropagation, produces smaller weights, thereby causing the outputs of the neurons to spend more time in the linear region.
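The energy function described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes a one-layer tanh network, a single training pattern, and replaces the second backward pass (which gives double backpropagation its name) with a finite-difference weight update for brevity. The penalty weight `lam` and all hyperparameters are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(W, b, x, t, lam):
    """Normal backprop energy plus (lam/2) * ||dE/dx||^2.

    The second term penalizes the sensitivity of the error to small
    input perturbations, the key idea of double backpropagation.
    """
    y = np.tanh(W @ x + b)
    e = y - t
    E = 0.5 * (e @ e)                    # standard squared-error term
    g = W.T @ ((1.0 - y ** 2) * e)       # Jacobian-based term: dE/dx
    return E + 0.5 * lam * (g @ g)

def num_grad(f, W, eps=1e-6):
    """Central finite differences w.r.t. W -- a simple stand-in for
    the exact second backward pass used in the paper."""
    G = np.zeros_like(W)
    for idx in np.ndindex(*W.shape):
        Wp = W.copy(); Wp[idx] += eps
        Wm = W.copy(); Wm[idx] -= eps
        G[idx] = (f(Wp) - f(Wm)) / (2.0 * eps)
    return G

# One toy pattern: 3-dim input, 2-dim target inside tanh's range.
x = rng.normal(size=3)
t = np.array([0.5, -0.5])
W = rng.normal(scale=0.1, size=(2, 3))
b = np.zeros(2)
lam, lr = 1.0, 0.05

E0 = energy(W, b, x, t, lam)
for _ in range(300):
    W -= lr * num_grad(lambda Wv: energy(Wv, b, x, t, lam), W)
E1 = energy(W, b, x, t, lam)

print(E0, "->", E1)   # the combined energy should decrease
```

Note that the Jacobian penalty vanishes as the error goes to zero, so minimizing the combined energy still fits the training pattern while discouraging large input sensitivity along the way.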
Pages: 991-997 (7 pages)