A Dropout Distribution Model on Deep Networks

Cited by: 0
Authors
Li, Fengqi [1 ]
Yang, Helin [1 ]
Affiliations
[1] Dalian Univ Technol, Sch Software Technol, Econ & Technol Dev Area, Dalian 116620, Peoples R China
Keywords
Deep neural networks; Dropout; Rate distribution;
DOI
10.1117/12.2243971
Chinese Library Classification
O43 [Optics];
Subject classification codes
070207; 0803;
Abstract
Dropout has been shown to control overfitting effectively and to improve the generalization of deep networks. However, conventional dropout applies a constant rate when training the parameters of every layer, which reduces classification accuracy and efficiency. To address this problem, this paper proposes a dropout rate distribution model based on an analysis of the relationship between the dropout rate and the layers of a deep network. First, we give a formal description of the dropout rate that reveals its relationship to network depth. Second, we propose a distribution model for determining the dropout rate used when training each layer. Experiments on the MNIST and CIFAR-10 datasets evaluate the proposed model against networks with constant dropout rates. The results demonstrate that the proposed model outperforms conventional dropout in both classification accuracy and efficiency.
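The abstract's core idea is to replace a single constant dropout rate with a rate that varies by layer. The exact distribution is not given in the abstract, so the sketch below assumes, purely for illustration, a linear schedule from a low rate in early layers to a higher rate in later layers, applied with standard inverted dropout; `p_min`, `p_max`, and the linear form are assumptions, not the paper's model.

```python
import numpy as np

def layer_dropout_rates(n_layers, p_min=0.1, p_max=0.5):
    """Illustrative per-layer dropout schedule: the rate grows linearly
    with depth. This linear form is an assumption for demonstration;
    the paper proposes its own rate distribution."""
    return np.linspace(p_min, p_max, n_layers)

def dropout_forward(x, p, rng, training=True):
    """Standard inverted dropout for one layer with drop probability p.
    Surviving activations are rescaled by 1/(1-p) so the expected
    activation is unchanged at test time."""
    if not training or p == 0.0:
        return x
    mask = (rng.random(x.shape) >= p).astype(x.dtype)
    return x * mask / (1.0 - p)

# Apply a different dropout rate at each of 4 hidden layers.
rng = np.random.default_rng(0)
rates = layer_dropout_rates(n_layers=4)
h = np.ones((2, 8))
for p in rates:
    h = dropout_forward(h, p, rng)
```

The point of the per-layer schedule is that shallow layers, which learn general features, are perturbed less than deep layers, which are more prone to co-adaptation; a constant rate cannot express that trade-off.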
Pages: 7