Knowledge distillation approach for skin cancer classification on lightweight deep learning model

Cited by: 0
Authors
Saha, Suman [1 ]
Hemal, Md. Moniruzzaman [1 ]
Eidmum, Md. Zunead Abedin [1 ]
Mridha, Muhammad Firoz [2 ]
Affiliations
[1] Department of IoT and Robotics Engineering, Bangabandhu Sheikh Mujibur Rahman Digital University, Bangladesh, Gazipur, Bangladesh
[2] Department of Computer Science, American International University-Bangladesh, Dhaka, Bangladesh
Keywords
Adversarial machine learning; Contrastive learning; Deep learning; Diseases; Oncology; Personnel training; Students
DOI
10.1049/htl2.12120
Abstract
Over the past decade, the incidence of skin cancer has increased worldwide. Skin cancer has serious consequences if left untreated, potentially progressing to more advanced stages. In recent years, deep learning-based convolutional neural networks have emerged as powerful tools for skin cancer detection. However, deep learning approaches are generally computationally expensive and require large storage space, so deploying such large, complex models on resource-constrained devices is challenging. An ultra-light yet accurate deep learning model is highly desirable for better inference time and memory use on low-power devices. Knowledge distillation is an approach for transferring knowledge from a large network to a small network; the small network is easily compatible with resource-constrained embedded devices while maintaining accuracy. The main aim of this study is to develop a lightweight deep learning network, based on knowledge distillation, that identifies the presence of skin cancer. Different training strategies are implemented for a modified benchmark model (Phase 1) and a custom-made model (Phase 2), and various distillation configurations are demonstrated on two datasets: HAM10000 and ISIC2019. In Phase 1, the student model trained with knowledge distillation achieved accuracies ranging from 88.69% to 93.24% on HAM10000 and from 82.14% to 84.13% on ISIC2019. In Phase 2, accuracies ranged from 88.63% to 88.89% on HAM10000 and from 81.39% to 83.42% on ISIC2019. These results highlight the effectiveness of knowledge distillation in improving classification performance across diverse datasets, enabling the student model to approach the performance of the teacher model. In addition, the distilled student model can be easily deployed on resource-constrained devices for automated skin cancer detection owing to its lower computational complexity. © 2025 The Author(s).
Healthcare Technology Letters published by John Wiley & Sons Ltd on behalf of The Institution of Engineering and Technology.
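The teacher-to-student transfer described in the abstract is commonly implemented as a Hinton-style soft-target objective: the student is trained to match the teacher's temperature-softened output distribution in addition to the ground-truth label. The sketch below is a generic illustration of that loss, not the authors' implementation; the temperature (T=4.0) and weighting (alpha=0.7) are illustrative assumptions, and the seven-way logits simply mirror the seven lesion classes of HAM10000.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    scaled = [z / T for z in logits]
    m = max(scaled)                             # subtract max for stability
    exps = [math.exp(z - m) for z in scaled]
    s = sum(exps)
    return [e / s for e in exps]

def distillation_loss(student_logits, teacher_logits, hard_label,
                      T=4.0, alpha=0.7):
    """Generic knowledge-distillation objective:
    alpha * T^2 * KL(teacher || student) on softened outputs
    + (1 - alpha) * cross-entropy against the ground-truth label."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = sum(t * (math.log(t) - math.log(s)) for t, s in zip(p_t, p_s))
    ce = -math.log(softmax(student_logits)[hard_label])
    return alpha * T * T * kl + (1 - alpha) * ce

# Illustrative seven-class logits (HAM10000 has seven lesion classes).
teacher = [2.0, 0.1, -1.0, 0.5, 0.0, -0.5, 0.3]
student = [1.5, 0.2, -0.8, 0.4, 0.1, -0.4, 0.2]
loss = distillation_loss(student, teacher, hard_label=0)
```

A student whose logits already match the teacher's incurs zero KL penalty and is trained only by the hard-label term, which is why distillation lets a small network approach the teacher's accuracy at a fraction of its computational cost.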