Parallel Approach for Ensemble Learning with Locally Coupled Neural Networks

Authors
Carlos Valle
Francisco Saravia
Héctor Allende
Raúl Monge
César Fernández
Affiliations
[1] Universidad Técnica Federico Santa María, Departamento de Informática
Source
Neural Processing Letters | 2010, Vol. 32
Keywords
Ensemble learning; Artificial neural networks; Parallel algorithms; Local negative correlation; Resampling;
DOI
Not available
Abstract
Ensemble learning has gained considerable attention in tasks including regression, classification and clustering. Adaboost and Bagging are two popular approaches for training these models. The former provides accurate estimations in regression settings but is computationally expensive because of its inherently sequential structure, while the latter is less accurate but highly efficient. A major drawback of ensemble algorithms is the high computational cost of the training stage. To address this issue, we propose a parallel implementation of the Resampling Local Negative Correlation (RLNC) algorithm for training a neural network ensemble, aiming at accuracy competitive with Adaboost and efficiency comparable to Bagging. We test our approach on both synthetic and real regression datasets from the UCI and StatLib repositories. In particular, our fine-grained parallel approach achieves a satisfactory balance between accuracy and parallel efficiency.
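To make the negative-correlation idea behind RLNC concrete, the sketch below trains a small regression ensemble whose members share a penalty that pushes their predictions apart from the ensemble mean. This is a minimal illustration only: it uses linear learners, synthetic data, and an illustrative penalty weight `lam`, whereas the paper's RLNC algorithm couples neural networks locally and adds resampling; none of the names or values here come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic toy regression data (assumption: not the paper's datasets)
X = rng.normal(size=(200, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=200)

M, lam, lr = 5, 0.5, 0.05          # ensemble size, NC penalty weight, step size
W = 0.1 * rng.normal(size=(M, 3))  # one linear learner per ensemble member

for _ in range(300):
    F = X @ W.T                            # per-member predictions, shape (200, M)
    f_bar = F.mean(axis=1, keepdims=True)  # ensemble (mean) prediction
    # Gradient of 0.5*(f_i - y)^2 - lam*(f_i - f_bar)^2 w.r.t. f_i:
    # the second term rewards members for disagreeing with the ensemble mean.
    grad_out = (F - y[:, None]) - lam * (F - f_bar)
    W -= lr * grad_out.T @ X / len(X)

pred = (X @ W.T).mean(axis=1)      # combine members by averaging
mse = float(np.mean((pred - y) ** 2))
```

Because each member's update depends only on its own output and the current ensemble mean, the per-member gradient steps are independent given `f_bar`, which is what makes a fine-grained parallel implementation natural.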
Pages: 277–291 (14 pages)