A New Framework for Fine Tuning of Deep Networks

Cited by: 9
Authors
Wani, M. Arif [1]
Afzal, Saduf [1]
Affiliations
[1] Univ Kashmir, Postgrad Dept Comp Sci, Srinagar, Jammu & Kashmir, India
Keywords
Deep Learning; Deep Neural Networks; Fine Tuning; Dropout Technique; Gain Parameter and Dropout Technique; VALIDITY INDEX; ALGORITHM
DOI
10.1109/ICMLA.2017.0-135
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Training of deep neural networks often involves two learning phases: unsupervised pretraining and supervised fine tuning. Unsupervised pretraining is used to learn the parameters of the deep neural network, while supervised fine tuning improves upon what has been learnt in the pretraining stage. The predominant algorithm used for supervised fine tuning of deep neural networks is the standard backpropagation algorithm. However, in the field of shallow neural networks, a number of modifications to the backpropagation algorithm have been proposed that improve the performance of the trained model. In this paper we propose a new framework that integrates a gain-parameter-based backpropagation algorithm with the dropout technique, and we evaluate its effectiveness for fine tuning deep neural networks on three benchmark datasets. The results indicate that the proposed hybrid approach performs better fine tuning than the backpropagation algorithm alone.
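The record does not give the paper's exact update rules, but the core idea of combining a gain-scaled sigmoid with dropout during fine tuning can be sketched as follows. This is a minimal illustration only, assuming one hidden layer, a scalar learnable gain per layer, inverted dropout, and a squared-error loss; the names GainDropoutMLP, c1, and drop_p are invented for the example and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class GainDropoutMLP:
    """Sketch of the hybrid fine-tuning step: sigmoid hidden units with a
    learnable gain c1, plus inverted dropout on the hidden activations."""

    def __init__(self, n_in, n_hid, n_out, drop_p=0.5, lr=0.1):
        self.W1 = rng.normal(0.0, 0.1, (n_hid, n_in))
        self.b1 = np.zeros(n_hid)
        self.W2 = rng.normal(0.0, 0.1, (n_out, n_hid))
        self.b2 = np.zeros(n_out)
        self.c1 = 1.0          # gain of the hidden sigmoid (scalar per layer: an assumption)
        self.drop_p = drop_p   # probability of dropping a hidden unit
        self.lr = lr

    def train_step(self, x, t):
        # forward pass: gain-scaled sigmoid, then inverted dropout
        a1 = self.W1 @ x + self.b1
        h = sigmoid(self.c1 * a1)
        mask = (rng.random(h.shape) > self.drop_p) / (1.0 - self.drop_p)
        h_d = h * mask
        y = sigmoid(self.W2 @ h_d + self.b2)

        # backward pass for E = 0.5 * ||y - t||^2 (output gain fixed at 1)
        delta2 = (y - t) * y * (1.0 - y)
        grad_W2 = np.outer(delta2, h_d)

        # error reaching the hidden layer passes through the dropout mask
        dh = (self.W2.T @ delta2) * mask
        # dh/da1 = c1 * h * (1 - h)   and   dh/dc1 = a1 * h * (1 - h)
        delta1 = dh * self.c1 * h * (1.0 - h)
        grad_c1 = np.sum(dh * a1 * h * (1.0 - h))
        grad_W1 = np.outer(delta1, x)

        # gradient-descent updates: weights, biases, and the gain itself
        self.W2 -= self.lr * grad_W2
        self.b2 -= self.lr * delta2
        self.W1 -= self.lr * grad_W1
        self.b1 -= self.lr * delta1
        self.c1 -= self.lr * grad_c1
        return 0.5 * np.sum((y - t) ** 2)

# tiny usage example
net = GainDropoutMLP(n_in=4, n_hid=8, n_out=1)
loss = net.train_step(rng.random(4), np.array([1.0]))
```

At test time the dropout mask is simply omitted; inverted dropout already rescales activations during training, so no further correction of the learned gain or weights is needed.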
Pages: 359-363
Page count: 5
Related Papers
50 items in total
  • [1] Fine-tuning Deep Belief Networks using Harmony Search
    Papa, Joao Paulo
    Scheirer, Walter
    Cox, David Daniel
    APPLIED SOFT COMPUTING, 2016, 46 : 875 - 885
  • [2] No Fine-Tuning, No Cry: Robust SVD for Compressing Deep Networks
    Tukan, Murad
    Maalouf, Alaa
    Weksler, Matan
    Feldman, Dan
    SENSORS, 2021, 21 (16)
  • [3] Event Recognition on Images by Fine-Tuning of Deep Neural Networks
    Yudin, Dmitry
    Zeno, Bassel
    PROCEEDINGS OF THE SECOND INTERNATIONAL SCIENTIFIC CONFERENCE INTELLIGENT INFORMATION TECHNOLOGIES FOR INDUSTRY (IITI'17), VOL 1, 2018, 679 : 479 - 487
  • [4] Twice fine-tuning deep neural networks for paraphrase identification
    Ko, Bowon
    Choi, Ho-Jin
    ELECTRONICS LETTERS, 2020, 56 (09) : 444 - 446
  • [5] Fine-Tuning Deep Neural Networks in Continuous Learning Scenarios
    Kaeding, Christoph
    Rodner, Erik
    Freytag, Alexander
    Denzler, Joachim
    COMPUTER VISION - ACCV 2016 WORKSHOPS, PT III, 2017, 10118 : 588 - 605
  • [6] Quaternion-based Deep Belief Networks fine-tuning
    Papa, Joao Paulo
    Rosa, Gustavo H.
    Pereira, Danillo R.
    Yang, Xin-She
    APPLIED SOFT COMPUTING, 2017, 60 : 328 - 335
  • [7] Joint Fine-Tuning in Deep Neural Networks for Facial Expression Recognition
    Jung, Heechul
    Lee, Sihaeng
    Yim, Junho
    Park, Sunjeong
    Kim, Junmo
    2015 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2015, : 2983 - 2991
  • [8] Comparison of Fine-Tuning and Extension Strategies for Deep Convolutional Neural Networks
    Pittaras, Nikiforos
    Markatopoulou, Foteini
    Mezaris, Vasileios
    Patras, Ioannis
    MULTIMEDIA MODELING (MMM 2017), PT I, 2017, 10132 : 102 - 114
  • [9] Fine-Tuning Based Deep Convolutional Networks for Lepidopterous Genus Recognition
    Carvajal, Juan A.
    Romero, Dennis G.
    Sappa, Angel D.
    PROGRESS IN PATTERN RECOGNITION, IMAGE ANALYSIS, COMPUTER VISION, AND APPLICATIONS, CIARP 2016, 2017, 10125 : 467 - 475
  • [10] Sparse Mutation Decompositions: Fine Tuning Deep Neural Networks with Subspace Evolution
    Whitaker, Tim
    Whitley, Darrell
    PROCEEDINGS OF THE 2023 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE COMPANION, GECCO 2023 COMPANION, 2023, : 695 - 698