A New Framework for Fine Tuning of Deep Networks

Times Cited: 9
Authors
Wani, M. Arif [1 ]
Afzal, Saduf [1 ]
Affiliations
[1] Univ Kashmir, Postgrad Dept Comp Sci, Srinagar, Jammu & Kashmir, India
Keywords
Deep Learning; Deep Neural Networks; Fine Tuning; Dropout Technique; Gain Parameter and Dropout Technique; VALIDITY INDEX; ALGORITHM
DOI
10.1109/ICMLA.2017.0-135
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Very often, training of deep neural networks involves two learning phases: unsupervised pretraining and supervised fine tuning. Unsupervised pretraining is used to learn the parameters of deep neural networks, while supervised fine tuning improves upon what has been learnt in the pretraining stage. The predominant algorithm used for supervised fine tuning of deep neural networks is the standard backpropagation algorithm. However, in the field of shallow neural networks, a number of modifications to the backpropagation algorithm have been proposed that improve the performance of the trained model. In this paper, we propose a new framework that integrates a gain-parameter-based backpropagation algorithm with the dropout technique, and we evaluate its effectiveness for fine tuning of deep neural networks on three benchmark datasets. The results indicate that the proposed hybrid approach performs better fine tuning than the backpropagation algorithm alone.
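The abstract describes the hybrid fine-tuning scheme only at a high level. The sketch below is one possible reading of that combination, assuming a logistic activation with a gain (slope) parameter, sigmoid(gain * z), and inverted dropout on the hidden units during a single supervised fine-tuning step; the layer sizes, gain value, dropout rate, learning rate, and squared-error loss are illustrative assumptions and are not taken from the paper.

```python
# Minimal sketch (not the authors' exact method): one supervised fine-tuning
# step for a single hidden layer, combining a gain-parameter sigmoid
# activation with dropout on the hidden units.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z, gain=1.5):
    """Logistic activation with a gain (slope) parameter."""
    return 1.0 / (1.0 + np.exp(-gain * z))

def fine_tune_step(x, y, W1, W2, gain=1.5, drop_p=0.5, lr=0.1):
    # Forward pass with the gain-parameter sigmoid on the hidden layer.
    z1 = x @ W1
    h = sigmoid(z1, gain)

    # Inverted dropout: randomly zero hidden units and rescale the survivors.
    mask = (rng.random(h.shape) >= drop_p) / (1.0 - drop_p)
    h_drop = h * mask

    z2 = h_drop @ W2
    out = sigmoid(z2, gain)

    # Backward pass (squared-error loss); the gain scales each sigmoid derivative.
    err = out - y
    delta2 = err * gain * out * (1.0 - out)
    delta1 = (delta2 @ W2.T) * mask * gain * h * (1.0 - h)

    # Gradient-descent update of the weights.
    W2 -= lr * h_drop.T @ delta2
    W1 -= lr * x.T @ delta1
    return W1, W2

# Toy usage with random weights; sizes are arbitrary.
x = rng.standard_normal((4, 8))           # batch of 4 inputs, 8 features
y = rng.integers(0, 2, (4, 1)).astype(float)
W1 = rng.standard_normal((8, 16)) * 0.1   # input -> hidden
W2 = rng.standard_normal((16, 1)) * 0.1   # hidden -> output
W1, W2 = fine_tune_step(x, y, W1, W2)
```

In an actual fine-tuning run as described in the abstract, W1 and W2 would be initialized from the unsupervised pretraining stage rather than at random.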
Pages: 359-363
Number of Pages: 5