Fine-tuning the hyperparameters of pre-trained models for solving multiclass classification problems

Cited: 0
Authors
Kaibassova, D. [1 ]
Nurtay, M. [1 ]
Tau, A. [1 ]
Kissina, M. [1 ]
Affiliations
[1] Abylkas Saginov Karaganda Tech Univ, 56 N Nazarbayev Ave, Karaganda City 100000, Kazakhstan
Keywords
multiclass classification; transfer learning; fine-tuning; CNN; image augmentation; X-ray;
DOI
10.18287/2412-6179-CO-1078
CLC number
O43 [Optics];
Subject classification codes
070207 ; 0803 ;
Abstract
This study addresses the application of fine-tuning methods to transfer learning models for multiclass classification of medical X-ray images. To this end, the structural features of the pre-trained models VGG-19, ResNet-50, and InceptionV3 were studied. For these models, the following fine-tuning methods were applied: unfreezing the last convolutional layer and updating its weights, and selecting the learning rate and optimizer. The dataset consisted of chest X-ray images provided by the Society for Imaging Informatics in Medicine (SIIM), a leading healthcare organization in its field, in partnership with the Foundation for the Promotion of Health and Biomedical Research of Valencia Region (FISABIO), the Valencian Region Medical ImageBank (BIMCV), and the Radiological Society of North America (RSNA). The experimental results show that pre-trained models with subsequent fine-tuning are well suited to multiclass classification in medical image processing. Notably, the ResNet-50-based model achieved the best result, with 82.74 % accuracy. Results for all models are reported in the corresponding tables.
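The fine-tuning recipe described in the abstract (unfreeze the final convolutional stage of a pre-trained backbone, then tune the learning rate and optimizer) can be illustrated with a minimal tf.keras sketch. This is not the authors' exact configuration: the input size, the four-class head, the dropout rate, the Adam optimizer, and the 1e-4 learning rate are illustrative assumptions, and the last residual block (tf.keras name prefix "conv5_block3") stands in for the paper's "last convolutional layer".

    import tensorflow as tf
    from tensorflow.keras import layers, models, optimizers

    NUM_CLASSES = 4  # hypothetical class count; not taken from the paper

    # Load ResNet-50 pre-trained on ImageNet without its classification head.
    base = tf.keras.applications.ResNet50(
        weights="imagenet",
        include_top=False,
        input_shape=(224, 224, 3),  # assumed input size
    )

    # Unfreeze only the final convolutional block; keep everything else frozen.
    # "conv5_block3" is tf.keras's name prefix for ResNet-50's last block.
    base.trainable = True
    for layer in base.layers:
        if not layer.name.startswith("conv5_block3"):
            layer.trainable = False

    # Attach a new multiclass head on top of the mostly frozen backbone.
    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dropout(0.5),                              # illustrative regularization
        layers.Dense(NUM_CLASSES, activation="softmax"),  # multiclass output
    ])

    # The hyperparameters being tuned in the paper's sense: the optimizer
    # choice and the learning rate. The values here are placeholders.
    model.compile(
        optimizer=optimizers.Adam(learning_rate=1e-4),
        loss="categorical_crossentropy",  # assumes one-hot encoded labels
        metrics=["accuracy"],
    )
    model.summary()

Swapping in tf.keras.applications.VGG19 or InceptionV3 as the backbone (with that architecture's own last-block layer names) would reproduce the same recipe for the other two models studied.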
Pages: 971-979
Number of pages: 9
Related papers
50 records in total
  • [31] An Empirical Study of Parameter-Efficient Fine-Tuning Methods for Pre-trained Code Models
    Liu, Jiaxing
    Sha, Chaofeng
    Peng, Xin
    2023 38TH IEEE/ACM INTERNATIONAL CONFERENCE ON AUTOMATED SOFTWARE ENGINEERING, ASE, 2023, : 397 - 408
  • [32] APPT: Boosting Automated Patch Correctness Prediction via Fine-Tuning Pre-Trained Models
    Zhang, Quanjun
    Fang, Chunrong
    Sun, Weisong
    Liu, Yan
    He, Tieke
    Hao, Xiaodong
    Chen, Zhenyu
    IEEE TRANSACTIONS ON SOFTWARE ENGINEERING, 2024, 50 (03) : 474 - 494
  • [33] Parameter-efficient fine-tuning of large-scale pre-trained language models
    Ding, Ning
    Qin, Yujia
    Yang, Guang
    Wei, Fuchao
    Yang, Zonghan
    Su, Yusheng
    Hu, Shengding
    Chen, Yulin
    Chan, Chi-Min
    Chen, Weize
    Yi, Jing
    Zhao, Weilin
    Wang, Xiaozhi
    Liu, Zhiyuan
    Zheng, Hai-Tao
    Chen, Jianfei
    Liu, Yang
    Tang, Jie
    Li, Juanzi
    Sun, Maosong
    NATURE MACHINE INTELLIGENCE, 2023, 5 (03) : 220 - +
  • [34] Fine-Tuning Pre-Trained Audio Models for Dysarthria Severity Classification: A Second Place Solution in the Multimodal Dysarthria Severity Classification Challenge
    Dai, Wei
    Li, Menglong
    He, Yingqi
    Zhu, Yongqiang
    2024 IEEE 14TH INTERNATIONAL SYMPOSIUM ON CHINESE SPOKEN LANGUAGE PROCESSING, ISCSLP 2024, 2024, : 151 - 153
  • [35] An ensemble of pre-trained transformer models for imbalanced multiclass malware classification
    Demirkiran, Ferhat
    Cayir, Aykut
    Unal, Gur
    Dag, Hasan
    COMPUTERS & SECURITY, 2022, 121
  • [36] Novel Fine-Tuning Strategy on Pre-trained Protein Model Enhances ACP Functional Type Classification
    Wang, Shaokai
    Ma, Bin
    BIOINFORMATICS RESEARCH AND APPLICATIONS, PT I, ISBRA 2024, 2024, 14954 : 371 - 382
  • [37] SFMD: A Semi-supervised Framework for Pre-trained Language Models Fine-Tuning with Noisy Samples
    Yang, Yiwen
    Duan, Pengfei
    Li, Yongbing
    Zhang, Yifang
    Xiong, Shengwu
    ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT III, ICIC 2024, 2024, 14877 : 316 - 328
  • [38] TIBW: Task-Independent Backdoor Watermarking with Fine-Tuning Resilience for Pre-Trained Language Models
    Mo, Weichuan
    Chen, Kongyang
    Xiao, Yatie
    MATHEMATICS, 2025, 13 (02)
  • [39] Parameter-Efficient Fine-Tuning of Pre-trained Large Language Models for Financial Text Analysis
    Langa, Kelly
    Wang, Hairong
    Okuboyejo, Olaperi
    ARTIFICIAL INTELLIGENCE RESEARCH, SACAIR 2024, 2025, 2326 : 3 - 20
  • [40] ON FINE-TUNING PRE-TRAINED SPEECH MODELS WITH EMA-TARGET SELF-SUPERVISED LOSS
    Yang, Hejung
    Kang, Hong-Goo
    2024 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING, ICASSP 2024, 2024, : 6360 - 6364