Inequalities of generalization errors for layered neural networks in Bayesian learning

Cited by: 0
Author
Watanabe, S [1 ]
Affiliation
[1] Tokyo Inst Technol, Precis & Intelligence Lab, Midori Ku, Yokohama, Kanagawa 2268503, Japan
Keywords
Bayesian learning; layered neural networks; generalization error;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper proves inequalities on the generalization error of layered neural networks in Bayesian learning. It is shown that, if a three-layer perceptron with M input units, H hidden units, and N output units is trained on data generated by a true model with H-1 hidden units, then the generalization error is smaller than D/(2n), where D is the number of parameters and n is the number of training samples.
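A minimal sketch of the bound stated in the abstract, written out in LaTeX; the parameter count D is an assumption here (taken to include all weights and biases of the M-H-N perceptron), and the paper may count parameters differently:

\[
  D \;=\; \underbrace{MH + H}_{\text{input-to-hidden}} \;+\; \underbrace{HN + N}_{\text{hidden-to-output}}
    \;=\; H(M + N + 1) + N,
\]
\[
  \mathbb{E}[G_n] \;<\; \frac{D}{2n},
\]

where $G_n$ denotes the Bayesian generalization error after $n$ training samples, and the inequality is stated for the case where the true model has $H-1$ hidden units, i.e. the learning machine has one redundant hidden unit.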
Pages: 59 - 62
Page count: 4