DropELM: Fast neural network regularization with Dropout and DropConnect

Cited by: 40
Authors:
Iosifidis, Alexandros [1 ]
Tefas, Anastasios [1 ]
Pitas, Ioannis [1 ]
Affiliations:
[1] Aristotle Univ Thessaloniki, Dept Informat, Thessaloniki 54124, Greece
Keywords:
Single Hidden Layer Feedforward Networks; Extreme Learning Machine; Regularization; Dropout; DropConnect; EXTREME LEARNING-MACHINE; FACE RECOGNITION; CLASSIFICATION;
DOI:
10.1016/j.neucom.2015.04.006
Chinese Library Classification (CLC):
TP18 [Artificial Intelligence Theory];
Subject Classification Codes:
081104 ; 0812 ; 0835 ; 1405 ;
Abstract:
In this paper, we propose an extension of the Extreme Learning Machine (ELM) algorithm for Single-hidden Layer Feedforward Network training that incorporates Dropout and DropConnect regularization in its optimization process. We show that both types of regularization lead to the same solution for the calculation of the network output weights, which is adopted by the proposed DropELM network. The proposed algorithm is able to exploit Dropout and DropConnect regularization without computationally intensive iterative weight tuning. We show that the adoption of such a regularization approach can lead to better solutions for the network output weights. We incorporate the proposed regularization approach in several recently proposed ELM algorithms and show that their performance can be enhanced without much additional computational cost. (C) 2015 Elsevier B.V. All rights reserved.
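The abstract describes replacing iterative dropout training with a closed-form solution for the ELM output weights. The sketch below is illustrative only, not the paper's exact formulation: it contrasts standard ridge-regularized ELM with a dropout-marginalized variant, using the known result that averaging a squared-error objective over Bernoulli masks on the hidden activations adds a data-dependent diagonal penalty. All variable names and the toy data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: X (n_samples x n_features), one-hot class targets T
X = rng.standard_normal((200, 10))
T = np.eye(3)[rng.integers(0, 3, 200)]

# ELM: the hidden layer is random and never trained
n_hidden = 50
W_in = rng.standard_normal((10, n_hidden))
b = rng.standard_normal(n_hidden)
H = np.tanh(X @ W_in + b)  # hidden-layer activations (200 x 50)

# Standard regularized ELM: W_out = (H^T H + c I)^{-1} H^T T
c = 1e-2
W_ridge = np.linalg.solve(H.T @ H + c * np.eye(n_hidden), H.T @ T)

# Dropout-marginalized variant (keep probability p): taking the
# expectation of the squared error over Bernoulli masks on H yields
# a diagonal, data-dependent penalty instead of the scaled identity:
#   W_out = (H^T H + ((1-p)/p) * diag(H^T H))^{-1} H^T T
p = 0.8
reg = ((1 - p) / p) * np.diag(np.diag(H.T @ H))
W_drop = np.linalg.solve(H.T @ H + reg, H.T @ T)

# Inference uses the deterministic network; no masks are sampled
pred = (H @ W_drop).argmax(axis=1)
```

The key point, consistent with the abstract's claim of no "computationally intensive iterative weight tuning", is that the dropout effect enters only through the regularization matrix, so the output weights are still obtained from a single linear solve.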
Pages: 57-66
Number of pages: 10
Related Papers (50 records)
  • [31] Controlled Dropout: a Different Dropout for Improving Training Speed on Deep Neural Network
    Ko, ByungSoo
    Kim, Han-Gyu
    Choi, Ho-Jin
    2017 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2017, : 972 - 977
  • [32] Integrating Dropout Regularization Technique at Different Layers to Improve the Performance of Neural Networks
    Pansambal, B. H.
    Nandgaokar, A. B.
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2023, 14 (04) : 716 - 722
  • [33] CamDrop: A New Explanation of Dropout and A Guided Regularization Method for Deep Neural Networks
    Wang, Hongjun
    Wang, Guangrun
    Li, Guanbin
    Lin, Liang
    PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT (CIKM '19), 2019, : 1141 - 1149
  • [34] A Review on Dropout Regularization Approaches for Deep Neural Networks within the Scholarly Domain
    Salehin, Imrus
    Kang, Dae-Ki
    ELECTRONICS, 2023, 12 (14)
  • [35] Hybridized sine cosine algorithm with convolutional neural networks dropout regularization application
    Bacanin, Nebojsa
    Zivkovic, Miodrag
    Al-Turjman, Fadi
    Venkatachalam, K.
    Trojovsky, Pavel
    Strumberger, Ivana
    Bezdan, Timea
    SCIENTIFIC REPORTS, 2022, 12 (01)
  • [36] Batch Normalization and Dropout Regularization in Training Deep Neural Networks with Label Noise
    Rusiecki, Andrzej
    INTELLIGENT SYSTEMS DESIGN AND APPLICATIONS, ISDA 2021, 2022, 418 : 57 - 66
  • [37] ISING-DROPOUT: A REGULARIZATION METHOD FOR TRAINING AND COMPRESSION OF DEEP NEURAL NETWORKS
    Salehinejad, Hojjat
    Valaee, Shahrokh
    2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 3602 - 3606
  • [39] On Dropout and Nuclear Norm Regularization
    Mianjy, Poorya
    Arora, Raman
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [40] Supervision dropout: guidance learning in deep neural network
    Liang Zeng
    Hao Zhang
    Yanyan Li
    Maodong Li
    Shanshan Wang
    Multimedia Tools and Applications, 2023, 82 : 18831 - 18850