A Comparative Study of Data Sampling Techniques for Constructing Neural Network Ensembles

Cited by: 19
Authors
Akhand, M. A. H. [1]
Islam, Md. Monirul [1]
Murase, Kazuyuki [1,2]
Affiliations
[1] Univ Fukui, Grad Sch Engn, Fukui 9108507, Japan
[2] Univ Fukui, Res & Educ Program Life Sci, Fukui 9108507, Japan
Keywords
Neural network ensemble; generalization; diversity; bagging; boosting; negative correlation learning; random subspace method; gradient learning algorithm; classification
DOI
10.1142/S0129065709001859
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Ensembles of several classifiers (such as neural networks or decision trees) are widely used to improve generalization performance over that of a single classifier. Adequate diversity among the component classifiers is considered an important requirement for ensemble construction, so that the failure of one classifier may be compensated by the others. Among the various approaches to achieving diversity, data sampling, i.e., training different classifiers on different data sets, has been found more effective than the alternatives. A number of ensemble methods have been proposed under the umbrella of data sampling; some are restricted to neural networks or decision trees, while others are applicable to both types of classifiers. We studied prominent data sampling techniques for neural network ensembles and then experimentally evaluated their effectiveness on a common test ground. Based on the notions of overlap and uncover, the relation between generalization and diversity is presented. Eight ensemble methods were tested on 30 benchmark classification problems. We found that bagging and boosting, the pioneering ensemble methods, still perform better than most of the methods proposed later. However, negative correlation learning, which implicitly encourages different networks toward different training spaces, is shown to be better than, or at least comparable to, bagging and boosting, which explicitly create different training spaces.
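To make the data-sampling idea concrete, the sketch below illustrates bagging, the simplest of the sampled-data methods: each component network is trained on its own bootstrap sample of the training set (drawn with replacement), and the ensemble output is obtained by majority voting. This is a minimal illustration assuming scikit-learn's MLPClassifier as the component network and the Iris data as a stand-in benchmark; it is not the authors' experimental setup, and the choice of 10 networks and a simple voting rule is ours for brevity.

```python
# Minimal sketch of bagging (bootstrap aggregation) for a neural network
# ensemble. Assumptions not taken from the paper: scikit-learn's
# MLPClassifier as the component network, the Iris data set, 10 networks,
# and simple majority voting as the combination rule.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier


def bagging_ensemble(X, y, n_networks=10, seed=0):
    """Train each network on its own bootstrap sample (drawn with replacement)."""
    rng = np.random.default_rng(seed)
    nets = []
    n = len(X)
    for _ in range(n_networks):
        idx = rng.integers(0, n, size=n)          # a different data set for each classifier
        net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000)
        net.fit(X[idx], y[idx])
        nets.append(net)
    return nets


def majority_vote(nets, X):
    """Combine the component outputs by majority voting."""
    votes = np.stack([net.predict(X) for net in nets])   # shape: (n_networks, n_samples)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)


X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
ensemble = bagging_ensemble(X_tr, y_tr)
accuracy = float(np.mean(majority_vote(ensemble, X_te) == y_te))
print(f"bagged ensemble test accuracy: {accuracy:.3f}")
```

The other methods compared in the paper vary the same ingredient in different ways: boosting replaces the uniform bootstrap with resampling weights that emphasize previously misclassified examples, the random subspace method samples feature subsets rather than example subsets, and negative correlation learning trains all networks on the same data while adding a correlation penalty to each network's error function.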
Pages: 67-89
Number of pages: 23