DropELM: Fast neural network regularization with Dropout and DropConnect

Cited by: 40
Authors
Iosifidis, Alexandros [1 ]
Tefas, Anastasios [1 ]
Pitas, Ioannis [1 ]
Affiliations
[1] Aristotle Univ Thessaloniki, Dept Informat, Thessaloniki 54124, Greece
Keywords
Single Hidden Layer Feedforward Networks; Extreme Learning Machine; Regularization; Dropout; DropConnect; EXTREME LEARNING-MACHINE; FACE RECOGNITION; CLASSIFICATION;
DOI
10.1016/j.neucom.2015.04.006
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we propose an extension of the Extreme Learning Machine (ELM) algorithm for Single-hidden Layer Feedforward Neural network training that incorporates Dropout and DropConnect regularization in its optimization process. We show that both types of regularization lead to the same solution for the calculation of the network output weights, which is adopted by the proposed DropELM network. The proposed algorithm is able to exploit Dropout and DropConnect regularization without computationally intensive iterative weight tuning. We show that adopting such a regularization approach can lead to better solutions for the network output weights. We incorporate the proposed regularization approach into several recently proposed ELM algorithms and show that their performance can be enhanced without much additional computational cost. (C) 2015 Elsevier B.V. All rights reserved.
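The central idea of the abstract is that Dropout/DropConnect regularization can be folded directly into the closed-form ELM output-weight solution, instead of being applied through stochastic masking during iterative training. Below is a minimal sketch of that idea, assuming a marginalized-dropout penalty of the form diag(HᵀH) as a stand-in regularizer; the exact regularization matrix used by DropELM is derived in the paper and may differ, and the parameters `p` and `lam` here are illustrative, not the authors' settings.

```python
# Sketch: ELM with a dropout-inspired penalty folded into the closed-form
# output-weight solution. This is NOT the authors' reference implementation;
# the penalty lam * ((1-p)/p) * diag(H^T H) is an assumed surrogate for the
# DropELM regularization matrix, and p/lam are hypothetical parameters.
import numpy as np

rng = np.random.default_rng(0)

def elm_hidden(X, W, b):
    """Random hidden-layer mapping with a sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def fit_output_weights(H, T, p=0.8, lam=1e-2):
    """Closed-form output weights with a dropout-style penalty.

    Standard regularized ELM solves (H^T H + c I) beta = H^T T.
    Marginalizing Bernoulli(p) dropout on the hidden activations adds a
    data-dependent term proportional to diag(H^T H), used here as an
    assumed illustration of regularization without iterative tuning.
    """
    G = H.T @ H
    R = np.diag(np.diag(G))                  # dropout-induced penalty (assumed form)
    A = G + lam * ((1.0 - p) / p) * R
    return np.linalg.solve(A, H.T @ T)

# Toy usage: 100 samples, 5 inputs, 50 hidden neurons, 3 one-hot targets.
X = rng.standard_normal((100, 5))
T = np.eye(3)[rng.integers(0, 3, size=100)]
W = rng.standard_normal((5, 50))             # random input weights (not tuned)
b = rng.standard_normal(50)                  # random hidden biases

H = elm_hidden(X, W, b)
beta = fit_output_weights(H, T)
pred = (H @ beta).argmax(axis=1)
print("training accuracy:", (pred == T.argmax(axis=1)).mean())
```

Because the penalty enters only the linear system for the output weights, the training cost stays that of a single regularized least-squares solve, which is the computational advantage the abstract emphasizes.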
Pages: 57-66
Page count: 10
Related Papers
50 items in total
  • [21] Dropout Effect On Probabilistic Neural Network
    Shahadat, Nazmul
    Rahman, Bushra
    Ahmed, Faisal
    Anwar, Ferdows
    2017 INTERNATIONAL CONFERENCE ON ELECTRICAL, COMPUTER AND COMMUNICATION ENGINEERING (ECCE), 2017, : 217 - 222
  • [22] Node Classification Using Graph Convolutional Network with Dropout Regularization
    Xiao, Bing-Yu
    Tseng, Chien-Cheng
    Lee, Su-Ling
    2021 IEEE REGION 10 CONFERENCE (TENCON 2021), 2021, : 84 - 87
  • [23] Modeling and implementation of a novel active voltage balancing circuit using deep recurrent neural network with dropout regularization
    Noohi, Mostafa
    Faraji, Amin
    Sadrossadat, Sayed Alireza
    Mirvakili, Ali
    Moftakharzadeh, Ali
    INTERNATIONAL JOURNAL OF CIRCUIT THEORY AND APPLICATIONS, 2023, 51 (05) : 2351 - 2374
  • [24] Implicit Regularization of Dropout
    Zhang, Zhongwang
    Xu, Zhi-Qin John
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (06) : 4206 - 4217
  • [25] Analysis on Dropout Regularization
    Sum, John
    Leung, Chi-Sing
    NEURAL INFORMATION PROCESSING, ICONIP 2019, PT V, 2019, 1143 : 253 - 261
  • [26] Optical random micro-phase-shift DropConnect in a diffractive deep neural network
    Xiao, Yong-Liang
    Li, Sikun
    Situ, Guohai
    Zhong, Jianxin
    OPTICS LETTERS, 2022, 47 (07) : 1746 - 1749
  • [27] Controlled Dropout: a Different Approach to Using Dropout on Deep Neural Network
    Ko, ByungSoo
    Kim, Han-Gyu
    Oh, Kyo-Joong
    Choi, Ho-Jin
    2017 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (BIGCOMP), 2017, : 358 - 362
  • [28] Enhancement of Deep Architecture using Dropout / DropConnect Techniques Applied for AHR System
    Elleuch, Mohamed
    Alimi, Adel M.
    Kherallah, Monji
    2018 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2018,
  • [29] NEUFAIR: Neural Network Fairness Repair with Dropout
    Dasu, Vishnu Asutosh
    Kumar, Ashish
    Tizpaz-Niari, Saeid
    Tan, Gang
    PROCEEDINGS OF THE 33RD ACM SIGSOFT INTERNATIONAL SYMPOSIUM ON SOFTWARE TESTING AND ANALYSIS, ISSTA 2024, 2024, : 1541 - 1553
  • [30] Adaptive regularization in neural network modeling
    Larsen, J
    Svarer, C
    Andersen, LN
    Hansen, LK
    NEURAL NETWORKS: TRICKS OF THE TRADE, 1998, 1524 : 113 - 132