A New Framework for Fine Tuning of Deep Networks

Cited by: 9
Authors
Wani, M. Arif [1 ]
Afzal, Saduf [1 ]
Affiliations
[1] Univ Kashmir, Postgrad Dept Comp Sci, Srinagar, Jammu & Kashmir, India
Keywords
Deep Learning; Deep Neural Networks; Fine Tuning; Dropout Technique; Gain Parameter and Dropout Technique; VALIDITY INDEX; ALGORITHM;
DOI
10.1109/ICMLA.2017.0-135
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Training of deep neural networks often involves two learning phases: unsupervised pretraining and supervised fine tuning. Unsupervised pretraining is used to learn the parameters of deep neural networks, while supervised fine tuning improves upon what has been learnt in the pretraining stage. The predominant algorithm used for supervised fine tuning of deep neural networks is the standard backpropagation algorithm. However, in the field of shallow neural networks, a number of modifications to the backpropagation algorithm have been proposed that improve the performance of the trained model. In this paper we propose a new framework that integrates a gain parameter based backpropagation algorithm with the dropout technique, and we evaluate its effectiveness in the fine tuning of deep neural networks on three benchmark datasets. The results indicate that the proposed hybrid approach performs better fine tuning than the backpropagation algorithm alone.
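The abstract combines two ideas: a sigmoid activation with a learnable gain (slope) parameter, updated by backpropagation alongside the weights, and dropout applied to hidden units during fine tuning. The paper's exact formulation is not given here, so the following is only a minimal NumPy sketch under those assumptions (single hidden layer, squared-error loss, inverted dropout); all function names and hyperparameters are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z, gain=1.0):
    # Sigmoid with a gain (slope) parameter: larger gain -> steeper transition.
    return 1.0 / (1.0 + np.exp(-gain * z))

def train_step(X, y, W1, W2, gain, lr=0.1, drop_p=0.5):
    # Forward pass: gain-scaled sigmoid hidden units with inverted dropout.
    z1 = X @ W1
    h = sigmoid(z1, gain)
    mask = (rng.random(h.shape) >= drop_p) / (1.0 - drop_p)
    h_drop = h * mask
    out = sigmoid(h_drop @ W2)            # output layer uses unit gain

    # Backward pass (squared-error loss).
    err = out - y
    d_out = err * out * (1 - out)
    dW2 = h_drop.T @ d_out
    d_h = (d_out @ W2.T) * mask           # dropout mask gates the gradient too
    # Chain rule through sigmoid(gain * z1): d/dz1 = gain * h * (1 - h)
    d_z1 = d_h * gain * h * (1 - h)
    dW1 = X.T @ d_z1
    # Gradient w.r.t. the gain itself:      d/dgain = z1 * h * (1 - h)
    d_gain = np.sum(d_h * z1 * h * (1 - h))

    # Update weights and the gain parameter jointly.
    W1 = W1 - lr * dW1
    W2 = W2 - lr * dW2
    gain = gain - lr * d_gain
    return W1, W2, gain, float(np.mean(err ** 2))

# Toy data just to exercise one fine-tuning step.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])
W1 = rng.normal(scale=0.5, size=(2, 8))   # stands in for pretrained weights
W2 = rng.normal(scale=0.5, size=(8, 1))
gain = 1.0
W1, W2, gain, loss = train_step(X, y, W1, W2, gain)
```

The key design point this sketch illustrates is that the gain enters the backward pass twice: it scales the weight gradients through the activation derivative, and it receives its own gradient so it adapts during fine tuning, while the dropout mask zeroes both activations and their gradients.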
Pages: 359 - 363
Page count: 5
Related papers (50 total)
  • [41] GENERALIZING AND FINE TUNING TRIPLE-LOOP NETWORKS
    Lin, Bey-Chi
    Hwang, F. K.
    JOURNAL OF INTERCONNECTION NETWORKS, 2009, 10 (1-2) : 133 - 148
  • [42] A New Framework for Automatic Modulation Classification using Deep Belief Networks
    Ghasemzadeh, Pejman
    Banerjee, Subharthi
    Hempel, Michael
    Sharif, Hamid
    2020 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS WORKSHOPS (ICC WORKSHOPS), 2020,
  • [43] A proposed framework for fine-tuning/training a fuzzy controller
    Mjeda, Anila
    Jackman, Brendan
    Leen, Gabriel
    2006 IEEE CONFERENCE ON EMERGING TECHNOLOGIES & FACTORY AUTOMATION, VOLS 1 -3, 2006, : 1223 - +
  • [44] A fine-tuning deep learning framework to palliate data distribution shift effects in rotary machine fault detection
    Rezazadeh, Nima
    Perfetto, Donato
    de Oliveira, Mario
    De Luca, Alessandro
    Lamanna, Giuseppe
    STRUCTURAL HEALTH MONITORING-AN INTERNATIONAL JOURNAL, 2024,
  • [46] A New Fine-Tuning Argument for the Multiverse
    Friederich, Simon
    FOUNDATIONS OF PHYSICS, 2019, 49 (09) : 1011 - 1021
  • [47] Fine tuning deep learning models for breast tumor classification
    Heikal, Abeer
    El-Ghamry, Amir
    Elmougy, Samir
    Rashad, M. Z.
    SCIENTIFIC REPORTS, 2024, 14 (01):
  • [48] Probability Fusion Decision Framework of Multiple Deep Neural Networks for Fine-Grained Visual Classification
    Zheng, Yang-Yang
    Kong, Jian-Lei
    Jin, Xue-Bo
    Wang, Xiao-Yi
    Su, Ting-Li
    Wang, Jian-Li
    IEEE ACCESS, 2019, 7 : 122740 - 122757
  • [49] Fine-Tuning Quadcopter Control Parameters via Deep Actor-Critic Learning Framework: An Exploration of Nonlinear Stability Analysis and Intelligent Gain Tuning
    Moin, Hassan
    Shah, Umer Hameed
    Khan, Muhammad Jawad
    Sajid, Hasan
    IEEE ACCESS, 2024, 12 : 173462 - 173474