DEEP INHOMOGENEOUS REGULARIZATION FOR TRANSFER LEARNING

Times Cited: 0
Authors
Wang, Wen [1 ]
Zhai, Wei [1 ]
Cao, Yang [1 ]
Affiliations
[1] Univ Sci & Technol China, Hefei 230027, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Transfer Learning; Regularization; Negative Transfer; Catastrophic Forgetting; Deep Learning;
DOI
Not available
Chinese Library Classification
TB8 [Photographic Technology];
Discipline Code
0804;
Abstract
Fine-tuning is an effective transfer learning method for achieving good performance on a target task with limited training data. Some recent works regularize the parameters of deep neural networks for better knowledge transfer. However, these methods enforce homogeneous penalties on all parameters, which leads to catastrophic forgetting or negative transfer. To address this problem, we propose a novel Inhomogeneous Regularization (IR) method that imposes strong regularization on the parameters of transferable convolutional filters to counter catastrophic forgetting, and relaxes the regularization on the parameters of less transferable filters to alleviate negative transfer. Moreover, we use the decaying averaged deviation of the parameters from the start point (the pre-trained parameters) to accurately measure the transferability of each filter. Evaluations on three challenging benchmark datasets demonstrate the superiority of the proposed method over state-of-the-art methods.
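The abstract describes a per-filter penalty toward the pre-trained start point, weighted by a decaying averaged deviation. The sketch below is a minimal illustration of that idea in PyTorch and is not the authors' implementation: the class name InhomogeneousRegularizer, the hyperparameters alpha and ema_decay, and the inverse-deviation weighting 1/(1 + deviation) are assumptions introduced only for illustration.

```python
# Minimal sketch of the inhomogeneous regularization idea, assuming PyTorch.
# The weighting function and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn


class InhomogeneousRegularizer:
    """Per-filter penalty toward the pre-trained start point.

    Filters whose decaying averaged deviation from the start point stays small
    are treated as transferable and pulled strongly back to the pre-trained
    weights (against catastrophic forgetting); filters that drift far receive
    a weaker pull (against negative transfer).
    """

    def __init__(self, model: nn.Module, alpha: float = 1e-2,
                 ema_decay: float = 0.9):
        self.alpha = alpha          # overall strength (assumed hyperparameter)
        self.ema_decay = ema_decay  # decay of the averaged deviation (assumed)
        # Snapshot of the pre-trained conv weights: shape (out, in, kH, kW).
        self.start = {n: p.detach().clone()
                      for n, p in model.named_parameters() if p.dim() == 4}
        # Decaying averaged deviation, one scalar per output filter.
        self.avg_dev = {n: torch.zeros(p.shape[0], device=p.device)
                        for n, p in self.start.items()}

    @torch.no_grad()
    def update_deviation(self, model: nn.Module) -> None:
        """Update the decaying average of each filter's distance to the start point."""
        for n, p in model.named_parameters():
            if n in self.start:
                dev = (p - self.start[n]).flatten(1).norm(dim=1)
                self.avg_dev[n].mul_(self.ema_decay).add_(dev, alpha=1 - self.ema_decay)

    def penalty(self, model: nn.Module) -> torch.Tensor:
        """Deviation-weighted L2 distance to the pre-trained parameters."""
        total = torch.zeros((), device=next(model.parameters()).device)
        for n, p in model.named_parameters():
            if n in self.start:
                # Assumed weighting: small averaged deviation -> weight near 1
                # (strong penalty); large deviation -> weight near 0 (weak penalty).
                w = 1.0 / (1.0 + self.avg_dev[n])
                sq = (p - self.start[n]).flatten(1).pow(2).sum(dim=1)
                total = total + (w * sq).sum()
        return self.alpha * total


# Hypothetical usage in a fine-tuning step:
#   reg = InhomogeneousRegularizer(pretrained_model)
#   loss = criterion(pretrained_model(x), y) + reg.penalty(pretrained_model)
#   loss.backward(); optimizer.step(); reg.update_deviation(pretrained_model)
```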
Pages: 221-225
Number of pages: 5