Improving Regularization in Deep Neural Networks by Co-adaptation Trace Detection

Cited by: 4
Authors
Moayed, Hojjat [1 ]
Mansoori, Eghbal G. [1 ]
Affiliations
[1] Shiraz Univ, Sch Elect & Comp Engn, Shiraz, Iran
Keywords
Deep neural networks; Regularization; Dropout; Co-adaptation; Under-/over-dropping
DOI
10.1007/s11063-023-11293-2
CLC number
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Co-adaptation of units is one of the most critical concerns in deep neural networks (DNNs), as it leads to overfitting. Dropout has been an active research subject in recent years as a means of preventing overfitting. In previous studies, the dropout probability is either kept fixed or changed according to a simple schedule during training. However, there is no evidence that co-adaptation is uniformly distributed among the units of a model. Consequently, dropout with an identical probability for all units produces imbalanced regularization and the under-/over-dropping problem. This paper proposes DropCT, a variant of dropout that detects co-adaptation traces (CTs) among units using the label propagation algorithm from community detection. It sets the DropCT probability of the units in each CT according to that trace's co-adaptation pressure, thereby applying a dynamic regularization that avoids under- and over-dropping. DropCT can be integrated with different architectures as a general regularization method. Experimental results confirm that DropCT improves generalization and is comparatively simple to apply, requiring no tuning of regularization hyperparameters.
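The abstract's pipeline, detect co-adapted groups of units via label propagation, then scale each group's dropout probability by its co-adaptation pressure, can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the function names (`label_propagation`, `dropct_probs`), the use of an activation-correlation threshold to build the unit graph, and the pressure heuristic (mean intra-community correlation) are hypothetical stand-ins, not the authors' exact formulation.

```python
import random

def label_propagation(adj):
    """Asynchronous label propagation for community detection.
    adj: dict mapping node -> set of neighbour nodes.
    Returns a dict mapping node -> community label."""
    labels = {n: n for n in adj}          # each node starts in its own community
    nodes = list(adj)
    rng = random.Random(0)                # fixed seed for a deterministic sketch
    for _ in range(20):
        changed = False
        rng.shuffle(nodes)
        for n in nodes:
            if not adj[n]:                # isolated unit: keep its own label
                continue
            counts = {}
            for m in adj[n]:
                counts[labels[m]] = counts.get(labels[m], 0) + 1
            # adopt the most frequent neighbour label (smallest label on ties)
            best = max(counts.items(), key=lambda kv: (kv[1], -kv[0]))[0]
            if best != labels[n]:
                labels[n] = best
                changed = True
        if not changed:
            break
    return labels

def dropct_probs(corr, base_p=0.5, threshold=0.6):
    """Assign per-unit drop probabilities from a unit-correlation matrix.
    Units in a strongly correlated community (a co-adaptation trace) get
    probabilities scaled up by the community's mean internal correlation,
    used here as a stand-in for 'co-adaptation pressure'."""
    n = len(corr)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if abs(corr[i][j]) >= threshold:   # edge = strongly correlated pair
                adj[i].add(j)
                adj[j].add(i)
    labels = label_propagation(adj)
    comms = {}
    for node, lab in labels.items():
        comms.setdefault(lab, []).append(node)
    probs = [base_p] * n                       # un-co-adapted units keep base_p
    for members in comms.values():
        if len(members) < 2:
            continue
        pairs = [(i, j) for i in members for j in members if i < j]
        pressure = sum(abs(corr[i][j]) for i, j in pairs) / len(pairs)
        for i in members:
            probs[i] = min(0.9, base_p * (1 + pressure))
    return probs
```

The intended effect: units outside any detected trace keep the base dropout probability, while units in a tightly co-adapted community are dropped more aggressively, which is the imbalance-aware behaviour the abstract describes.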
Pages: 7985-7997 (13 pages)