A Dropout Distribution Model on Deep Networks

Cited by: 0
Authors
Li, Fengqi [1 ]
Yang, Helin [1 ]
Affiliations
[1] Dalian Univ Technol, Sch Software Technol, Econ & Technol Dev Area, Dalian 116620, Peoples R China
Keywords
Deep neural networks; Dropout; Dropout rate distribution
DOI
10.1117/12.2243971
CLC classification
O43 [Optics]
Discipline codes
070207; 0803
Abstract
Dropout has proven effective at controlling overfitting and improving the generalization of deep networks. However, conventional dropout uses a single constant rate for training the parameters of every layer, which limits classification accuracy and efficiency. To address this problem, this paper proposes a dropout rate distribution model based on an analysis of the relationship between the dropout rate and the layers of a deep network. First, we give a formal description of the dropout rate that reveals how it relates to the network's layers. Second, we propose a distribution model for determining the dropout rate used when training each layer. Experiments on the MNIST and CIFAR-10 datasets evaluate the proposed model against networks trained with constant dropout rates. The results demonstrate that the proposed model outperforms conventional dropout in both classification accuracy and efficiency.
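The abstract does not give the actual distribution used to assign per-layer rates, so the schedule below is only an illustrative assumption of the general idea: each layer gets its own dropout probability instead of one constant rate. The sketch builds a small PyTorch MLP (sized for flattened 28x28 MNIST inputs); the helper layer_dropout_rates, the linear ramp, and the p_min/p_max bounds are hypothetical choices for this example, not the paper's model.

```python
import torch
import torch.nn as nn

def layer_dropout_rates(num_layers, p_min=0.1, p_max=0.5):
    """Illustrative schedule: dropout rate grows linearly with depth.
    This linear ramp is an assumption for demonstration; the paper's
    actual rate distribution may differ."""
    if num_layers == 1:
        return [p_max]
    step = (p_max - p_min) / (num_layers - 1)
    return [p_min + i * step for i in range(num_layers)]

class MLPWithDepthDropout(nn.Module):
    """MLP where each hidden layer has its own dropout rate, in
    contrast to conventional dropout's single constant rate."""
    def __init__(self, sizes=(784, 512, 256, 10)):
        super().__init__()
        hidden = len(sizes) - 2          # number of hidden layers
        rates = layer_dropout_rates(hidden)
        layers = []
        for i in range(hidden):
            layers += [nn.Linear(sizes[i], sizes[i + 1]),
                       nn.ReLU(),
                       nn.Dropout(p=rates[i])]  # layer-specific rate
        layers.append(nn.Linear(sizes[-2], sizes[-1]))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

model = MLPWithDepthDropout()
x = torch.randn(32, 784)     # a batch of flattened MNIST-sized inputs
logits = model(x)            # shape: (32, 10)
```

Any other monotone schedule (for example, one concentrated near the top layers) would change only the rate list returned by layer_dropout_rates; choosing that per-layer distribution well is exactly the degree of freedom the paper's model addresses.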
Pages: 7
Related papers
50 items in total
  • [21] Zheng, Hao; Chen, Mingming; Liu, Wenju; Yang, Zhanlei; Liang, Shan. Improving Deep Neural Networks by Using Sparse Dropout Strategy. 2014 IEEE China Summit & International Conference on Signal and Information Processing (ChinaSIP), 2014: 21-26.
  • [22] Wu, Xuanfu; Feng, Yang; Shao, Chenze. Generating Diverse Translation from Model Distribution with Dropout. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 1088-1097.
  • [23] Santra, Bikash; Paul, Angshuman; Mukherjee, Dipti Prasad. Deterministic Dropout for Deep Neural Networks Using Composite Random Forest. Pattern Recognition Letters, 2020, 131: 205-212.
  • [24] Zhang, Shiliang; Bao, Yebo; Zhou, Pan; Jiang, Hui; Dai, Lirong. Improving Deep Neural Networks for LVCSR Using Dropout and Shrinking Structure. 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2014.
  • [25] Salehinejad, Hojjat; Wang, Zijian; Valaee, Shahrokh. Ising Dropout with Node Grouping for Training and Compression of Deep Neural Networks. 2019 7th IEEE Global Conference on Signal and Information Processing (IEEE GlobalSIP), 2019.
  • [26] Salehinejad, Hojjat; Valaee, Shahrokh. EDropout: Energy-Based Dropout and Pruning of Deep Neural Networks. IEEE Transactions on Neural Networks and Learning Systems, 2022, 33 (10): 5279-5292.
  • [27] Wang, Siyue; Wang, Xiao; Zhao, Pu; Wen, Wujie; Kaeli, David; Chin, Peter; Lin, Xue. Defensive Dropout for Hardening Deep Neural Networks under Adversarial Attacks. 2018 IEEE/ACM International Conference on Computer-Aided Design (ICCAD), 2018.
  • [28] Tang, Yehui; Wang, Yunhe; Xu, Yixing; Shi, Boxin; Xu, Chao; Xu, Chunjing; Xu, Chang. Beyond Dropout: Feature Map Distortion to Regularize Deep Neural Networks. Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI), 2020, 34: 5964-5971.
  • [29] Avgerinos, Christos; Vretos, Nicholas; Daras, Petros. Less Is More: Adaptive Trainable Gradient Dropout for Deep Neural Networks. Sensors, 2023, 23 (3).
  • [30] Chen, Yuanyuan; Yi, Zhang. Adaptive Sparse Dropout: Learning the Certainty and Uncertainty in Deep Neural Networks. Neurocomputing, 2021, 450: 354-361.