Regularizing Deep Neural Networks by Noise: Its Interpretation and Optimization

Cited by: 0
Authors
Noh, Hyeonwoo [1 ]
You, Tackgeun [1 ]
Mun, Jonghwan [1 ]
Han, Bohyung [1 ]
Affiliations
[1] POSTECH, Dept Comp Sci & Engn, Pohang, South Korea
Keywords
DOI
None available
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Overfitting is one of the most critical challenges in deep neural networks, and various regularization methods exist to improve generalization performance. Injecting noise into hidden units during training, e.g., dropout, is known to be a successful regularizer, but it is still not clear why such training techniques work well in practice, or how to maximize their benefit in the presence of two conflicting objectives: fitting the true data distribution and preventing overfitting through regularization. This paper addresses these issues by 1) interpreting conventional training with regularization by noise injection as optimizing a lower bound of the true objective and 2) proposing a technique that achieves a tighter lower bound by using multiple noise samples per training example in each stochastic gradient descent iteration. We demonstrate the effectiveness of our idea in several computer vision applications.
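The abstract's second contribution, using multiple noise samples per training example to tighten the lower bound, follows from Jensen's inequality: averaging likelihoods over K noise samples before taking the log gives a bound that sits between the conventional single-sample objective and the true marginal log-likelihood. A minimal numerical sketch of this ordering, using a hypothetical toy likelihood function (not the paper's model), might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

def likelihood(x, noise):
    # Hypothetical per-noise-sample likelihood p(y | x, eps), just for illustration.
    return np.exp(-(x + noise) ** 2)

x = 0.5
eps = rng.normal(size=10000)          # noise samples (dropout-like perturbations)
p = likelihood(x, eps)

# Conventional objective: average of log-likelihoods over single noise samples.
# By Jensen's inequality, E[log p] <= log E[p], so this is a loose lower bound.
loose = np.mean(np.log(p))

# Multi-sample objective: log of the average likelihood over K noise samples,
# averaged over groups of K -- a tighter lower bound on the true objective.
K = 50
tight = np.mean(np.log(p.reshape(-1, K).mean(axis=1)))

# True objective: marginal log-likelihood, approximated over all noise samples.
true_obj = np.log(p.mean())

assert loose < tight < true_obj
```

As K grows, the multi-sample bound approaches the marginal log-likelihood, which is the sense in which more noise samples per example yield a tighter bound.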
Pages: 10
Related papers
50 results
  • [1] Ghost Noise for Regularizing Deep Neural Networks
    Kosson, Atli
    Fan, Dongyang
    Jaggi, Martin
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 12, 2024, : 13274 - 13282
  • [2] A Kernel Perspective for Regularizing Deep Neural Networks
    Bietti, Alberto
    Mialon, Gregoire
    Chen, Dexiong
    Mairal, Julien
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [3] Regularizing Iterative Reconstruction with Deep Neural Networks
    Ozaki, S.
    Kaji, S.
    Haga, A.
    Nawa, K.
    Ohta, T.
    Nozawa, Y.
    Nakamoto, T.
    Imae, T.
    Nakagawa, K.
    MEDICAL PHYSICS, 2020, 47 (06) : E437 - E438
  • [4] Dropout with Tabu Strategy for Regularizing Deep Neural Networks
    Ma, Zongjie
    Sattar, Abdul
    Zhou, Jun
    Chen, Qingliang
    Su, Kaile
    COMPUTER JOURNAL, 2020, 63 (07): : 1031 - 1038
  • [5] Regularizing Deep Neural Networks by Enhancing Diversity in Feature Extraction
    Ayinde, Babajide O.
    Inanc, Tamer
    Zurada, Jacek M.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2019, 30 (09) : 2650 - 2661
  • [6] Regularizing Deep Neural Networks with Stochastic Estimators of Hessian Trace
    Liu, Yucong
    Yu, Shixing
    Lin, Tong
    arXiv, 2022,
  • [7] Regularizing Deep Convolutional Neural Networks with a Structured Decorrelation Constraint
    Xiong, Wei
    Du, Bo
    Zhang, Lefei
    Hu, Ruimin
    Tao, Dacheng
    2016 IEEE 16TH INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2016, : 519 - 528
  • [8] Regularizing the effect of input noise injection in feedforward neural networks training
    Seghouane, Abd-Krim
    Moudden, Yassir
    Fleury, Gilles
    NEURAL COMPUTING & APPLICATIONS, 2004, 13 : 248 - 254
  • [9] Regularizing the effect of input noise injection in feedforward neural networks training
    Seghouane, AK
    Moudden, Y
    Fleury, G
    NEURAL COMPUTING & APPLICATIONS, 2004, 13 (03): : 248 - 254
  • [10] Regularizing Deep Neural Networks with an Ensemble-based Decorrelation Method
    Gu, Shuqin
    Hou, Yuexian
    Zhang, Lipeng
    Zhang, Yazhou
    PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018, : 2177 - 2183