Fine-tuning the hyperparameters of pre-trained models for solving multiclass classification problems

Cited by: 0
Authors
Kaibassova, D. [1 ]
Nurtay, M. [1 ]
Tau, A. [1 ]
Kissina, M. [1 ]
Affiliations
[1] Abylkas Saginov Karaganda Tech Univ, 56 N Nazarbayev Ave, Karaganda City 100000, Kazakhstan
关键词
multiclass classification; transfer learning; fine-tuning; CNN; image augmentation; X-ray;
DOI
10.18287/2412-6179-CO-1078
Chinese Library Classification
O43 [Optics];
Subject classification codes
070207; 0803;
Abstract
This study is devoted to the application of fine-tuning methods to transfer-learning models for solving a multiclass image classification problem on medical X-ray images. To this end, the structural features of the pre-trained models VGG-19, ResNet-50, and InceptionV3 were studied. For these models, the following fine-tuning methods were applied: unfreezing the last convolutional layer and updating its weights, and selecting the learning rate and optimizer. The dataset consisted of chest X-ray images provided by the Society for Imaging Informatics in Medicine (SIIM), a leading healthcare organization in its field, in partnership with the Foundation for the Promotion of Health and Biomedical Research of the Valencia Region (FISABIO), the Valencian Region Medical ImageBank (BIMCV), and the Radiological Society of North America (RSNA). The experimental results show that pre-trained models with subsequent fine-tuning are well suited to multiclass classification in medical image processing. The ResNet-50-based model showed the best result, with an accuracy of 82.74%. Results obtained for all models are reported in the corresponding tables.
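As a concrete illustration of the recipe the abstract describes (freeze a pre-trained backbone, unfreeze its last convolutional block, attach a new softmax head, and choose a learning rate and optimizer), here is a minimal sketch in TensorFlow/Keras. The class count, input size, layer-name prefix, dropout rate, and Adam learning rate are illustrative assumptions, not values taken from the paper.

```python
# Minimal fine-tuning sketch, assuming a ResNet-50 backbone as in the paper.
import tensorflow as tf

NUM_CLASSES = 4  # assumption: number of X-ray classes in the target task

# Load ImageNet-pre-trained ResNet-50 without its classification head.
base = tf.keras.applications.ResNet50(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))

# Freeze all layers, then unfreeze the final convolutional block
# ("conv5" in Keras' ResNet-50 layer naming) so its weights are updated.
base.trainable = True
for layer in base.layers:
    layer.trainable = layer.name.startswith("conv5")

# New classification head on top of the pre-trained features.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

# "Selecting the learning rate and optimizer": a small learning rate with
# Adam is a common choice when fine-tuning; the value here is an assumption.
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
```

Training would then proceed with model.fit on augmented image batches (the keywords mention image augmentation); a small learning rate is used so the unfrozen pre-trained weights are adjusted gently rather than overwritten.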
Pages: 971-979
Page count: 9
Related papers
50 records in total
  • [1] Span Fine-tuning for Pre-trained Language Models
    Bao, Rongzhou
    Zhang, Zhuosheng
    Zhao, Hai
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021, 2021, : 1970 - 1979
  • [2] Waste Classification by Fine-Tuning Pre-trained CNN and GAN
    Alsabei, Amani
    Alsayed, Ashwaq
    Alzahrani, Manar
    Al-Shareef, Sarah
    INTERNATIONAL JOURNAL OF COMPUTER SCIENCE AND NETWORK SECURITY, 2021, 21 (08): 65 - 70
  • [3] Fine-Tuning Pre-Trained Language Models with Gaze Supervision
    Deng, Shuwen
    Prasse, Paul
    Reich, David R.
    Scheffer, Tobias
    Jäger, Lena A.
    PROCEEDINGS OF THE 62ND ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 2: SHORT PAPERS, 2024, : 217 - 224
  • [4] Fine-tuning Pre-trained Models for Robustness under Noisy Labels
    Ahn, Sumyeong
    Kim, Sihyeon
    Ko, Jongwoo
    Yun, Se-Young
    PROCEEDINGS OF THE THIRTY-THIRD INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2024, 2024, : 3643 - 3651
  • [5] Debiasing Pre-Trained Language Models via Efficient Fine-Tuning
    Gira, Michael
    Zhang, Ruisu
    Lee, Kangwook
    PROCEEDINGS OF THE SECOND WORKSHOP ON LANGUAGE TECHNOLOGY FOR EQUALITY, DIVERSITY AND INCLUSION (LTEDI 2022), 2022, : 59 - 69
  • [6] Exploiting Syntactic Information to Boost the Fine-tuning of Pre-trained Models
    Liu, Chaoming
    Zhu, Wenhao
    Zhang, Xiaoyu
    Zhai, Qiuhong
    2022 IEEE 46TH ANNUAL COMPUTERS, SOFTWARE, AND APPLICATIONS CONFERENCE (COMPSAC 2022), 2022, : 575 - 582
  • [7] Gender-tuning: Empowering Fine-tuning for Debiasing Pre-trained Language Models
    Ghanbarzadeh, Somayeh
    Huang, Yan
    Palangi, Hamid
    Moreno, Radames Cruz
    Khanpour, Hamed
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, 2023, : 5448 - 5458
  • [8] Pruning Pre-trained Language Models Without Fine-Tuning
    Jiang, Ting
    Wang, Deqing
    Zhuang, Fuzhen
    Xie, Ruobing
    Xia, Feng
    PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 1, 2023, : 594 - 605
  • [9] Pathologies of Pre-trained Language Models in Few-shot Fine-tuning
    Chen, Hanjie
    Zheng, Guoqing
    Awadallah, Ahmed Hassan
    Ji, Yangfeng
    PROCEEDINGS OF THE THIRD WORKSHOP ON INSIGHTS FROM NEGATIVE RESULTS IN NLP (INSIGHTS 2022), 2022, : 144 - 153
  • [10] Revisiting k-NN for Fine-Tuning Pre-trained Language Models
    Li, Lei
    Chen, Jing
    Tian, Bozhong
    Zhang, Ningyu
    CHINESE COMPUTATIONAL LINGUISTICS, CCL 2023, 2023, 14232 : 327 - 338