DropELM: Fast neural network regularization with Dropout and DropConnect

Cited by: 40
Authors
Iosifidis, Alexandros [1 ]
Tefas, Anastasios [1 ]
Pitas, Ioannis [1 ]
Affiliations
[1] Aristotle Univ Thessaloniki, Dept Informat, Thessaloniki 54124, Greece
Keywords
Single Hidden Layer Feedforward Networks; Extreme Learning Machine; Regularization; Dropout; DropConnect; EXTREME LEARNING-MACHINE; FACE RECOGNITION; CLASSIFICATION;
DOI
10.1016/j.neucom.2015.04.006
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we propose an extension of the Extreme Learning Machine (ELM) algorithm for Single-hidden Layer Feedforward Neural network training that incorporates Dropout and DropConnect regularization in its optimization process. We show that both types of regularization lead to the same solution for the calculation of the network output weights, and this solution is adopted by the proposed DropELM network. The proposed algorithm exploits Dropout and DropConnect regularization without computationally intensive iterative weight tuning. We show that the adoption of such a regularization approach can lead to better solutions for the network output weights. We incorporate the proposed regularization approach in several recently proposed ELM algorithms and show that their performance can be enhanced without requiring much additional computational cost. (C) 2015 Elsevier B.V. All rights reserved.
Pages: 57-66
Number of pages: 10
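
To make the closed-form idea in the abstract concrete, the following is a minimal, hypothetical sketch (not the paper's exact derivation): it applies the well-known marginalized-Dropout penalty for a linear output layer, adding a (p/(1-p))·diag(HᵀH) term to the ridge normal equations, so the ELM output weights are still obtained from a single linear solve rather than iterative weight tuning. Function and variable names are illustrative; the exact DropELM penalty (covering both Dropout and DropConnect) follows the derivation in the paper referenced by the DOI above.

```python
import numpy as np

def dropelm_output_weights(H, T, lam=1e-2, p=0.5):
    """Closed-form output weights for an ELM whose output layer is
    regularized by marginalized (expected) Dropout noise.

    H   : (N, L) hidden-layer activations for N samples, L hidden neurons
    T   : (N, C) target matrix (e.g. one-hot class labels)
    lam : standard ridge/Tikhonov regularization weight
    p   : Dropout probability on the hidden activations

    Marginalizing Bernoulli Dropout on H adds a data-dependent diagonal
    penalty (p / (1 - p)) * diag(H^T H) to the ridge normal equations,
    so the weights come from one linear solve -- no iterative tuning.
    """
    L = H.shape[1]
    G = H.T @ H                                   # (L, L) Gram matrix of hidden outputs
    drop_penalty = (p / (1.0 - p)) * np.diag(np.diag(G))
    A = G + drop_penalty + lam * np.eye(L)
    return np.linalg.solve(A, H.T @ T)            # (L, C) output weight matrix

# Tiny usage example with a random ELM-style hidden layer (illustrative only):
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))                    # 100 samples, 20 features
W_in = rng.normal(size=(20, 50))                  # random, untrained input weights
b = rng.normal(size=50)
H = np.tanh(X @ W_in + b)                         # hidden activations
T = np.eye(3)[rng.integers(0, 3, size=100)]       # one-hot targets, 3 classes
W_out = dropelm_output_weights(H, T, lam=1e-2, p=0.3)
pred = (H @ W_out).argmax(axis=1)
```

With p = 0 the penalty vanishes and the sketch reduces to the standard regularized ELM solution, which is the sense in which Dropout-style regularization is folded into the closed-form output weight computation.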