Regularized Negative Correlation Learning for Neural Network Ensembles

Cited by: 95
Authors
Chen, Huanhuan [1 ]
Yao, Xin [1 ]
Affiliations
[1] Univ Birmingham, Sch Comp Sci, CERCIA, Birmingham B15 2TT, W Midlands, England
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS, 2009, Vol. 20, No. 12
Funding
Engineering and Physical Sciences Research Council (EPSRC), UK
Keywords
Ensembles; negative correlation learning (NCL); neural network ensembles; neural networks; probabilistic model; regularization; VECTOR; CLASSIFIERS; REGRESSION; ALGORITHM
DOI
10.1109/TNN.2009.2034144
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Negative correlation learning (NCL) is a neural network ensemble learning algorithm that adds a correlation penalty term to the cost function of each individual network, so that each network minimizes its mean square error (MSE) together with the correlation of the ensemble. This paper analyzes NCL and reveals that NCL training with λ = 1 corresponds to training the entire ensemble as a single learning machine that minimizes the MSE without any regularization, which explains why NCL is prone to overfitting noise in the training set. The paper also demonstrates that tuning the correlation parameter λ by cross-validation cannot overcome this overfitting problem, and proposes the regularized negative correlation learning (RNCL) algorithm, which incorporates an additional regularization term for the whole ensemble. RNCL decomposes the ensemble's training objective, comprising MSE and regularization, into a set of sub-objectives, each implemented by an individual neural network. The paper also provides a Bayesian interpretation of RNCL and an automatic algorithm, based on Bayesian inference, for optimizing the regularization parameters. The RNCL formulation is applicable to any nonlinear estimator minimizing the MSE. Experiments on synthetic and real-world data sets demonstrate that RNCL achieves better performance than NCL, especially when the noise level in the data set is nontrivial.
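
To make the abstract's central observation concrete, here is a minimal numerical sketch (not code from the paper) of the NCL and RNCL objectives. The arrays F (per-network outputs), y (targets), W (weight vectors), and alphas (regularization parameters) are illustrative stand-ins; the penalty form and the λ = 1 gradient identity follow the standard NCL formulation, and the exact constants in the published objectives may differ.

    import numpy as np

    rng = np.random.default_rng(0)
    M, N = 5, 200                        # M networks, N training points
    F = rng.normal(size=(M, N))          # f_i(x_n): stand-in per-network outputs
    y = rng.normal(size=N)               # targets
    f_ens = F.mean(axis=0)               # simple-average ensemble output
    lam = 1.0                            # correlation-penalty strength (lambda)

    # Correlation penalty in its product form,
    #   p_i(n) = (f_i - f_ens) * sum_{j != i} (f_j - f_ens),
    # which simplifies to -(f_i - f_ens)^2 because the deviations sum to zero.
    dev = F - f_ens                      # deviations; columns sum to zero
    p = dev * (dev.sum(axis=0) - dev)    # product form
    assert np.allclose(p, -dev ** 2)

    # Per-network NCL cost: half squared error plus the weighted penalty.
    ncl_cost = 0.5 * ((F - y) ** 2).sum(axis=1) + lam * p.sum(axis=1)

    # NCL differentiates p_i treating f_ens as a constant, so
    #   d e_i / d f_i = (f_i - y) - lam * (f_i - f_ens).
    # At lambda = 1 this collapses to (f_ens - y): every network follows the
    # *ensemble* error, i.e., the ensemble is trained as one learning machine
    # minimizing plain MSE with no regularization -- the abstract's key point.
    grad = (F - y) - lam * dev
    assert np.allclose(grad, np.broadcast_to(f_ens - y, (M, N)))

    # RNCL sub-objective: add a per-network weight-decay term alpha_i ||w_i||^2,
    # so the summed sub-objectives regularize the whole ensemble. W and alphas
    # are hypothetical here; the paper sets each alpha_i by Bayesian inference.
    W = rng.normal(size=(M, 30))         # stand-in weight vectors w_i
    alphas = np.full(M, 0.1)             # regularization parameters alpha_i
    rncl_cost = ncl_cost + alphas * (W ** 2).sum(axis=1)
    print(ncl_cost.shape, rncl_cost.shape)   # (5,) per-network costs

Both assertions pass as written: at λ = 1 every network's gradient is the shared ensemble error, so no setting of λ reintroduces regularization, which is the gap the per-network α_i terms in RNCL are designed to fill.
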
Pages: 1962-1979 (18 pages)