DropELM: Fast neural network regularization with Dropout and DropConnect

Times Cited: 40
Authors
Iosifidis, Alexandros [1 ]
Tefas, Anastasios [1 ]
Pitas, Ioannis [1 ]
Affiliations
[1] Aristotle Univ Thessaloniki, Dept Informat, Thessaloniki 54124, Greece
Keywords
Single Hidden Layer Feedforward Networks; Extreme Learning Machine; Regularization; Dropout; DropConnect; EXTREME LEARNING-MACHINE; FACE RECOGNITION; CLASSIFICATION;
DOI
10.1016/j.neucom.2015.04.006
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we propose an extension of the Extreme Learning Machine algorithm for Single-hidden Layer Feedforward Neural network training that incorporates Dropout and DropConnect regularization into its optimization process. We show that both types of regularization lead to the same solution for the calculation of the network output weights, which is adopted by the proposed DropELM network. The proposed algorithm is able to exploit Dropout and DropConnect regularization without computationally intensive iterative weight tuning. We show that the adoption of such a regularization approach can lead to better solutions for the network output weights. We incorporate the proposed regularization approach into several recently proposed ELM algorithms and show that their performance can be enhanced without much additional computational cost. (C) 2015 Elsevier B.V. All rights reserved.
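To make the closed-form flavour of this approach concrete, below is a minimal NumPy sketch: it builds a standard ELM (random input weights, least-squares output weights) and adds a Dropout-inspired diagonal penalty on the hidden-layer activations, following the expected effect of dropping hidden units with keep probability p on a rescaled linear readout. All names (`drop_elm_output_weights`, `keep_prob`, `lam`) are illustrative, and the exact regularization matrix derived in the DropELM paper may differ from this approximation.

```python
import numpy as np

def drop_elm_output_weights(X, T, n_hidden=100, keep_prob=0.8, lam=1e-2, seed=0):
    """Sketch of an ELM whose output-weight solution includes a Dropout-style penalty.

    X : (N, d) training inputs, T : (N, C) one-hot target matrix.
    The regularization matrix below is the expected-Dropout penalty for a
    rescaled linear readout, (1-p)/p * diag(H^T H); it is an illustration,
    not necessarily the exact matrix derived in the DropELM paper.
    """
    rng = np.random.default_rng(seed)

    # Standard ELM: input weights and biases are drawn at random and never trained.
    W_in = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W_in + b)                     # (N, n_hidden) hidden activations

    # Dropout-inspired diagonal penalty on the hidden activations.
    drop_penalty = (1.0 - keep_prob) / keep_prob * np.diag(np.diag(H.T @ H))

    # Closed-form output weights: ridge term plus the Dropout-style term,
    # so no iterative weight tuning is needed.
    A = H.T @ H + lam * np.eye(n_hidden) + drop_penalty
    W_out = np.linalg.solve(A, H.T @ T)           # (n_hidden, C) output weights
    return W_in, b, W_out

# Usage: W_in, b, W_out = drop_elm_output_weights(X_train, T_onehot)
# Predictions for new data: scores = np.tanh(X_test @ W_in + b) @ W_out
```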
Pages: 57-66
Number of Pages: 10
Related Papers
50 records in total
  • [41] Supervision dropout: guidance learning in deep neural network
    Zeng, Liang
    Zhang, Hao
    Li, Yanyan
    Li, Maodong
    Wang, Shanshan
    MULTIMEDIA TOOLS AND APPLICATIONS, 2023, 82 (12) : 18831 - 18850
  • [42] Multiple Explanations for Neural Network Based Dropout Prediction
    Lu, Junling
    Wu, Renran
    Li, Peng
    2024 4TH INTERNATIONAL CONFERENCE ON COMPUTER COMMUNICATION AND ARTIFICIAL INTELLIGENCE, CCAI 2024, 2024, : 247 - 252
  • [43] Neural Network Algorithm with Dropout Using Elite Selection
    Wang, Yong
    Wang, Kunzhao
    Wang, Gaige
    MATHEMATICS, 2022, 10 (11)
  • [44] BP Neural Network PID controller of pocket dropout
    Jing, Shuangxi
    Guo, Songtao
    Zhao, Xingyu
    Ren, Xiaoming
    2015 IEEE INTERNATIONAL CONFERENCE ON COMPUTER AND COMMUNICATIONS (ICCC), 2015, : 67 - 71
  • [45] Rademacher dropout: An adaptive dropout for deep neural network via optimizing generalization gap
    Wang, Haotian
    Yang, Wenjing
    Zhao, Zhenyu
    Luo, Tingjin
    Wang, Ji
    Tang, Yuhua
    NEUROCOMPUTING, 2019, 357 : 177 - 187
  • [46] A Neural Network Sparseness Algorithm Based on Relevance Dropout
    Wang, Jianjun
    Liu, Leshan
    2019 IEEE 6TH INTERNATIONAL CONFERENCE ON INDUSTRIAL ENGINEERING AND APPLICATIONS (ICIEA), 2019, : 480 - 484
  • [47] Regularization neural network for construction cost estimation
    Adeli, H
    Wu, MY
    JOURNAL OF CONSTRUCTION ENGINEERING AND MANAGEMENT, 1998, 124 (01) : 18 - 24
  • [48] A Hybrid Improved Neural Networks Algorithm Based on L2 and Dropout Regularization
    Xie, Xiaoyun
    Xie, Ming
    Moshayedi, Ata Jahangir
    Skandari, Mohammad Hadi Noori
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2022, 2022
  • [49] Regularization method for reduced biquaternion neural network
    Gai, Shan
    Huang, Xiang
    APPLIED SOFT COMPUTING, 2024, 166
  • [50] Batch Contrastive Regularization for Deep Neural Network
    Tanveer, Muhammad
    Tan, Hung Khoon
    Ng, Hui Fuang
    Leung, Maylor Karhang
    Chuah, Joon Huang
    PROCEEDINGS OF THE 12TH INTERNATIONAL JOINT CONFERENCE ON COMPUTATIONAL INTELLIGENCE (IJCCI), 2020, : 368 - 377