Stacked generalization in neural networks: Generalization on statistically neutral problems

Cited: 0
Authors:
Ghorbani, AA [1]
Owrangh, K [1]
Affiliation:
[1] Univ New Brunswick, Fac Comp Sci, Fredericton, NB E3B 5A3, Canada
DOI: not available
Chinese Library Classification (CLC): TP18 (artificial intelligence theory)
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Generalization remains one of the most important topics in neural networks and other classifiers. In recent years, a number of methods have been developed to improve generalization accuracy. Any classifier that uses induction to infer the class concept from the training patterns will have a hard time achieving an acceptable level of generalization accuracy when the problem to be learned is statistically neutral [4, 7]. A problem is statistically neutral if the probability of mapping an input onto an output is always the chance value of 0.5. In this paper, we examine the generalization behaviour of multilayer neural networks on learning statistically neutral problems using single-level learning models (e.g., the conventional cross-validation scheme) as well as multiple-level learning models (e.g., the stacked generalization method). We show that for statistically neutral problems such as the parity and majority functions, the stacked generalization scheme improves classification performance and generalization accuracy over the single-level cross-validation model.
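The definition of statistical neutrality can be checked directly for the parity function: conditioning on any single input bit leaves the probability of output 1 at the chance value of 0.5. A minimal sketch (the input width of 4 bits is an illustrative choice, not taken from the paper):

```python
from itertools import product

def parity(bits):
    """XOR of all bits: one of the statistically neutral targets studied."""
    return sum(bits) % 2

n = 4  # input width; illustrative assumption, not from the paper
patterns = list(product([0, 1], repeat=n))

# Statistical neutrality: for every input position i and bit value b,
# the conditional probability that parity(x) == 1 is exactly 0.5.
for i in range(n):
    for b in (0, 1):
        subset = [x for x in patterns if x[i] == b]
        p = sum(parity(x) for x in subset) / len(subset)
        print(f"P(parity=1 | bit{i}={b}) = {p}")  # 0.5 for every (i, b)
```

This is why inductive learners struggle here: no single input feature carries any statistical information about the class, so the mapping must be learned as a whole.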
Pages: 1715-1720 (6 pages)