Transfer Learning based Performance Comparison of the Pre-Trained Deep Neural Networks

Cited by: 0
Authors
Kumar, Jayapalan Senthil [1 ]
Anuar, Syahid [1 ]
Hassan, Noor Hafizah [1 ]
Affiliations
[1] Univ Teknol Malaysia UTM, Razak Fac Technol & Informat, Kuala Lumpur 54100, Malaysia
Keywords
Transfer learning; deep neural networks; image classification; Convolutional Neural Network (CNN) models; CLASSIFICATION;
DOI
10.14569/IJACSA.2022.0130193
CLC Number
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
Deep learning has grown tremendously in recent years, with a substantial impact on practically every discipline. Transfer learning allows the knowledge of a model previously trained for a particular task to be transferred to a new model that solves a related but not identical problem. To adapt a pre-trained model to a new task effectively, specific layers must be retrained while the others remain unmodified. Typical issues arise in choosing which layers to enable for training and which to freeze, and in setting hyperparameter values; all of these choices have a substantial effect on training capability as well as classification performance. The principal aim of this study is to compare the performance of selected pre-trained models under transfer learning, to guide the selection of a suitable model for image classification. To accomplish this goal, we examined five pre-trained networks, namely SqueezeNet, GoogleNet, ShuffleNet, Darknet-53, and Inception-V3, with different epochs, learning rates, and mini-batch sizes, and evaluated each network's performance using a confusion matrix. Based on the experimental findings, Inception-V3 achieved the highest accuracy of 96.98%, as well as the best values on the other evaluation metrics, with precision, sensitivity, specificity, and F1-score of 92.63%, 92.46%, 98.12%, and 92.49%, respectively.
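All of the metrics reported in the abstract (accuracy, precision, sensitivity, specificity, F1-score) are derived from a confusion matrix. As a minimal sketch of how they are computed in the binary case, assuming the standard definitions (the counts below are hypothetical, purely for illustration, and are not taken from the paper):

```python
def metrics_from_confusion(tp, fp, fn, tn):
    """Compute common evaluation metrics from a binary confusion matrix.

    tp/fp/fn/tn are the true-positive, false-positive, false-negative,
    and true-negative counts.
    """
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    sensitivity = tp / (tp + fn)   # recall / true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return accuracy, precision, sensitivity, specificity, f1


# Hypothetical counts for illustration only.
acc, prec, sens, spec, f1 = metrics_from_confusion(tp=90, fp=10, fn=10, tn=190)
print(round(prec, 3), round(sens, 3), round(spec, 3))  # → 0.9 0.9 0.95
```

For the multi-class setting used in the study, these quantities are typically computed per class (one-vs-rest) and then averaged.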
Pages: 797-805
Page count: 9
Related Papers
50 records
  • [11] Lithography Hotspot Detection Method Based on Transfer Learning Using Pre-Trained Deep Convolutional Neural Network
    Liao, Lufeng
    Li, Sikun
    Che, Yongqiang
    Shi, Weijie
    Wang, Xiangzhao
    APPLIED SCIENCES-BASEL, 2022, 12 (04):
  • [12] Following the Leader using a Tracking System based on Pre-trained Deep Neural Networks
    Mutz, Filipe
    Cardoso, Vinicius
    Teixeira, Thomas
    Jesus, Luan F. R.
    Golcalves, Michael A.
    Guidolini, Ranik
    Oliveira, Josias
    Badue, Claudine
    De Souza, Alberto F.
    2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 4332 - 4339
  • [13] Transfer Learning Effects on Image Steganalysis with Pre-Trained Deep Residual Neural Network Model
    Ozcan, Selim
    Mustacoglu, Ahmet Fatih
    2018 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2018, : 2280 - 2287
  • [14] Detecting Deceptive Utterances Using Deep Pre-Trained Neural Networks
    Wawer, Aleksander
    Sarzynska-Wawer, Justyna
    APPLIED SCIENCES-BASEL, 2022, 12 (12):
  • [15] Semantic Segmentation of Mammograms Using Pre-Trained Deep Neural Networks
    Prates, Rodrigo Leite
    Gomez-Flores, Wilfrido
    Pereira, Wagner
    2021 18TH INTERNATIONAL CONFERENCE ON ELECTRICAL ENGINEERING, COMPUTING SCIENCE AND AUTOMATIC CONTROL (CCE 2021), 2021,
  • [16] Harnessing Pre-Trained Neural Networks with Rules for Formality Style Transfer
    Wang, Yunli
    Wu, Yu
    Mou, Lili
    Li, Zhoujun
    Chao, Wenhan
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 3573 - 3578
  • [17] Backdoor Attacks Against Transfer Learning With Pre-Trained Deep Learning Models
    Wang, Shuo
    Nepal, Surya
    Rudolph, Carsten
    Grobler, Marthie
    Chen, Shangyu
    Chen, Tianle
    IEEE TRANSACTIONS ON SERVICES COMPUTING, 2022, 15 (03) : 1526 - 1539
  • [18] Can Deep Learning Find the Ischemic Core on CT? Transfer Learning From Pre-Trained MRI-Based Networks
    Yu, Yannan
    Christensen, Soren
    Xie, Yuan
    Gong, Enhao
    Lansberg, Maarten G.
    Albers, Greg
    Zaharchuk, Greg
    STROKE, 2021, 52
  • [19] Attentional Masking for Pre-trained Deep Networks
    Wallenberg, Marcus
    Forssen, Per-Erik
    2017 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2017, : 6149 - 6154
  • [20] Performance Improvement Of Pre-trained Convolutional Neural Networks For Action Recognition
    Ozcan, Tayyip
    Basturk, Alper
    COMPUTER JOURNAL, 2021, 64 (11): : 1715 - 1730