Revisiting spatial dropout for regularizing convolutional neural networks

Cited by: 0
Authors
Sanghun Lee
Chulhee Lee
Affiliation
[1] Yonsei University, Department of Electrical and Electronic Engineering
Source
Multimedia Tools and Applications, 2020, 79(45-46)
Keywords
Network regularization; Convolutional neural network; Spatial dropout; Deep learning
DOI
Not available
Abstract
Overfitting is one of the most challenging problems in deep neural networks with a large number of trainable parameters. To prevent networks from overfitting, the dropout method, a strong regularization technique, has been widely used in fully-connected neural networks (FCNNs). In several state-of-the-art convolutional neural network (CNN) architectures for object classification, however, dropout was applied only partially or not at all, since its accuracy gain was relatively insignificant in most cases. In addition, the batch normalization technique reduced the need for dropout because of its own regularization effect. In this paper, we show that conventional element-wise dropout can be ineffective for convolutional layers. We found that dropout between channels in CNNs can be functionally similar to dropout in FCNNs, and that spatial dropout can be an effective way to exploit the dropout technique for regularizing convolutional layers. To support these points, we conducted several experiments using the CIFAR-10 and CIFAR-100 datasets. For comparison, we replaced only the dropout layers with spatial dropout layers and kept all other hyperparameters and methods intact. DenseNet-BC with spatial dropout showed promising results (a 3.32% error rate on CIFAR-10 with 3.0M parameters) compared with other existing competitive methods.
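To make the distinction the abstract draws concrete, the sketch below contrasts element-wise dropout with spatial (channel-wise) dropout using PyTorch, whose nn.Dropout2d zeroes entire feature maps. This is a minimal illustration of the general technique, not the authors' code; the tensor shape and drop probability are arbitrary choices for the example.

```python
import torch
import torch.nn as nn

# Hypothetical feature-map tensor: (batch, channels, height, width).
x = torch.randn(8, 16, 32, 32)

# Element-wise dropout zeroes individual activations independently.
elem_drop = nn.Dropout(p=0.2)

# Spatial dropout (nn.Dropout2d) zeroes entire channels, so the strong
# spatial correlation inside a feature map cannot route information
# around the dropped units.
spatial_drop = nn.Dropout2d(p=0.2)

elem_drop.train()
spatial_drop.train()

y_elem = elem_drop(x)     # zeros scattered across every feature map
y_spat = spatial_drop(x)  # roughly p of the channels are entirely zero

# Verify the channel-wise behaviour: count fully zeroed feature maps.
fully_zeroed = (y_spat.abs().sum(dim=(2, 3)) == 0).float().mean()
print(f"fraction of fully zeroed channels: {fully_zeroed:.2f}")
```

In training mode, y_spat has about p of its channels zeroed out entirely, while y_elem has zeros scattered independently across every feature map; the channel-wise variant is the behaviour the paper argues makes dropout effective again in convolutional layers.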
Pages: 34195-34207
Page count: 12