Ensemble Approach for Improving Generalization Ability of Neural Networks

Cited: 0
Authors
Ahmed, Shaib [1 ]
Razib, Md. Razibul Islam [2 ]
Alam, Mohammed Shamsul [2 ]
Alam, Mohammad Shafiul [3 ]
Huda, Mohammad Nurul [1 ]
Affiliations
[1] United Int Univ, Dhaka, Bangladesh
[2] IIUC, Dhaka, Bangladesh
[3] Ahsanullah Univ Sci & Technol, Dhaka, Bangladesh
Keywords
Accuracy; Diversity; Ensemble; Neural Network; Independent Training; Majority Voting; Simple Averaging;
DOI
Not available
CLC classification
TP301 [Theory, Methods];
Subject classification
081202;
Abstract
This paper presents a study on improving the generalization ability of neural networks (NNs) using an ensemble approach. Both theoretical and experimental studies in the existing literature have shown that the performance, i.e., the generalization ability, of an NN ensemble depends strongly on both the accuracy of and the diversity among the individual NNs in the ensemble. In this study, the Back Propagation (BP) learning algorithm is used to train each individual NN independently for a fixed number of training epochs. We consider 12 different benchmark problems, a larger number than most previous studies. The experimental results show that the NN ensemble often outperforms the individual NNs, and that both accuracy and diversity among the participating networks matter for the generalization ability of the ensemble.
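The two combination schemes named in the keywords, majority voting and simple averaging, can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each independently trained network has already produced either hard class labels (for voting) or per-class probabilities (for averaging).

```python
from collections import Counter
from statistics import mean

def majority_vote(class_preds):
    """Combine hard labels: class_preds is a list of per-network label lists.
    For each sample, the most frequent label across networks wins."""
    n_samples = len(class_preds[0])
    return [Counter(model[i] for model in class_preds).most_common(1)[0][0]
            for i in range(n_samples)]

def simple_average(prob_preds):
    """Combine soft outputs: prob_preds is a list of per-network
    [n_samples][n_classes] probability tables. Average the class
    probabilities across networks, then take the argmax per sample."""
    n_samples = len(prob_preds[0])
    n_classes = len(prob_preds[0][0])
    combined = []
    for i in range(n_samples):
        avg = [mean(model[i][c] for model in prob_preds) for c in range(n_classes)]
        combined.append(max(range(n_classes), key=avg.__getitem__))
    return combined

# Three hypothetical networks' hard predictions on five samples (labels 0/1)
votes = [[1, 0, 1, 1, 0],
         [1, 1, 0, 1, 0],
         [0, 0, 1, 1, 1]]
print(majority_vote(votes))  # -> [1, 0, 1, 1, 0]
```

Simple averaging typically uses the networks' raw output activations rather than hard labels, which preserves each network's confidence; majority voting discards that information but is robust when individual networks are overconfident.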
Pages: 6
Related papers (50 total)
  • [21] An Empirical Study on Improving the Speed and Generalization of Neural Networks Using a Parallel Circuit Approach
    Kien Tuong Phan
    Tomas Henrique Maul
    Tuong Thuy Vu
    [J]. International Journal of Parallel Programming, 2017, 45 : 780 - 796
  • [23] Hybrid methodology for improving generalization performance of neural networks
    Yang, Zhong-Jin
    [J]. PROCEEDINGS OF 2006 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7, 2006, : 3167 - 3172
  • [24] Bayesian model comparison versus generalization ability of neural networks
    Gomari, M
    Järvi, T
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL I AND II, 1999, : 537 - 541
  • [25] Generalization ability of Boolean functions implemented in feedforward neural networks
    Franco, Leonardo
    [J]. NEUROCOMPUTING, 2006, 70 (1-3) : 351 - 361
  • [26] Quantitative analysis of the generalization ability of deep feedforward neural networks
    Yang, Yanli
    Li, Chenxia
    [J]. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2021, 40 (03) : 4867 - 4876
  • [27] Manipulation of Hidden Layers to Improve the Generalization Ability of Neural Networks
    Wongsinlatam, Wullapa
    [J]. PROGRESS IN APPLIED MATHEMATICS IN SCIENCE AND ENGINEERING PROCEEDINGS, 2016, 1705
  • [28] Improving the Performance of Neural Networks with an Ensemble of Activation Functions
    Nandi, Arijit
    Jana, Nanda Dulal
    Das, Swagatam
    [J]. 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [29] Improving generalization ability of multilayer networks by excluding irrelevant input components
    Ishii, M
    Kumazawa, I
    [J]. IEEE 2000 ADAPTIVE SYSTEMS FOR SIGNAL PROCESSING, COMMUNICATIONS, AND CONTROL SYMPOSIUM - PROCEEDINGS, 2000, : 203 - 206
  • [30] EnNeRFACE: improving the generalization of face reenactment with adaptive ensemble neural radiance fields
    Shuai Yang
    Kai Qiao
    Shuhao Shi
    Linyuan Wang
    Guoen Hu
    Bin Yan
    Jian Chen
    [J]. The Visual Computer, 2023, 39 : 6015 - 6028