Stacked generalization in neural networks: Generalization on statistically neutral problems

Cited: 0
Authors
Ghorbani, AA [1 ]
Owrangh, K [1 ]
Affiliation
[1] Univ New Brunswick, Fac Comp Sci, Fredericton, NB E3B 5A3, Canada
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Generalization remains one of the most important topics in neural networks and other classifiers. In recent years, a number of different methods have been developed to improve generalization accuracy. Any classifier that uses induction to find the class concept from the training patterns will have a hard time achieving an acceptable level of generalization accuracy when the problem to be learned is statistically neutral [4, 7]. A problem is statistically neutral if the probability of mapping an input onto an output is always the chance value of 0.5. In this paper, we examine the generalization behaviour of multilayer neural networks on learning statistically neutral problems using single-level learning models (e.g., the conventional cross-validation scheme) as well as multiple-level learning models (e.g., the stacked generalization method). We show that for statistically neutral problems such as the parity and majority functions, the stacked generalization scheme improves classification performance and generalization accuracy over the single-level cross-validation model.
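As a brief illustration (not taken from the paper itself), the statistical neutrality of the parity function can be checked directly: for n-bit parity, fixing any proper subset of the input bits leaves the output equally likely to be 0 or 1, so no partial view of the input carries information about the class. The helper names below (`parity`, `conditional_p_one`) are illustrative, not from the paper.

```python
from itertools import product

def parity(bits):
    """n-bit parity: 1 if an odd number of bits are set, else 0."""
    return sum(bits) % 2

def conditional_p_one(n, known):
    """P(parity = 1) over all n-bit inputs whose first len(known) bits are fixed."""
    free = n - len(known)
    outputs = [parity(list(known) + list(rest))
               for rest in product([0, 1], repeat=free)]
    return sum(outputs) / len(outputs)

# Fixing any proper prefix of the input gives no information:
# the conditional probability of output 1 stays at the chance level 0.5.
n = 4
for k in range(n):                           # fix the first k bits, k < n
    for known in product([0, 1], repeat=k):
        assert conditional_p_one(n, known) == 0.5
print("parity is statistically neutral for n =", n)
```

This is exactly the property that defeats inductive learners: every training subsample looks like noise, which is why the paper turns to a multiple-level scheme (stacked generalization) rather than a single cross-validated model.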
Pages: 1715-1720
Page count: 6