Classification of diabetic retinopathy using unlabeled data and knowledge distillation

Cited by: 11
Authors
Abbasi, Sajjad [1]
Hajabdollahi, Mohsen [1]
Khadivi, Pejman [2]
Karimi, Nader [1]
Roshandel, Roshanak [2]
Shirani, Shahram [3]
Samavi, Shadrokh [1,3]
Affiliations
[1] Isfahan Univ Technol, Dept Elect & Comp Engn, Esfahan 841568311, Iran
[2] Seattle Univ, Comp Sci Dept, Seattle, WA 98122 USA
[3] McMaster Univ, Dept Elect & Comp Engn, Hamilton, ON L8S 4L8, Canada
Keywords
Convolutional neural networks (CNN); Transfer learning; Knowledge distillation; Teacher-student model; Unlabeled data; Diabetic retinopathy; Image compression; Neural networks
DOI
10.1016/j.artmed.2021.102176
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Over the last decade, advances in Machine Learning and Artificial Intelligence have highlighted their potential as diagnostic tools in the healthcare domain. Despite the widespread availability of medical images, their usefulness is severely hampered by a lack of access to labeled data. For example, while Convolutional Neural Networks (CNNs) have emerged as an essential analytical tool in image processing, their impact is curtailed because insufficient labeled data limits training. Transfer learning enables a model developed for one task to be reused for a second task by transferring knowledge from a pre-trained model to another. However, it suffers from limitations: the two models need to be architecturally similar. Knowledge distillation addresses some of the shortcomings of transfer learning by generalizing a complex model into a lighter one. However, knowledge distillation may not transfer some parts of the knowledge sufficiently. In this paper, a novel knowledge distillation approach using transfer learning is proposed. The proposed approach transfers the complete knowledge of a model to a new, smaller one. Unlabeled data are used in an unsupervised manner to transfer the maximum amount of knowledge to the new, smaller model. The proposed method can be beneficial in medical image analysis, where labeled data are typically scarce. The approach is evaluated on the task of classifying images for diagnosing Diabetic Retinopathy using two publicly available datasets, Messidor and EyePACS. Simulation results demonstrate that the approach effectively transfers knowledge from a complex model to a lighter one. Furthermore, experimental results illustrate that the performance of different small models improves significantly when unlabeled data and knowledge distillation are used.
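The abstract describes a teacher-student pipeline in which a large pre-trained model produces soft predictions on unlabeled retinal images and a smaller student is trained to match them. Below is a minimal sketch of that unsupervised distillation step, assuming PyTorch; the architectures, temperature, learning rate, and data pipeline are illustrative placeholders, not the authors' exact configuration.

```python
# Sketch: distilling a large "teacher" classifier into a smaller "student"
# using only UNLABELED images, in the spirit of the abstract above.
# All hyperparameters and module names here are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader


def distill_on_unlabeled(teacher: nn.Module,
                         student: nn.Module,
                         unlabeled_loader: DataLoader,
                         epochs: int = 10,
                         temperature: float = 4.0,
                         lr: float = 1e-4,
                         device: str = "cuda" if torch.cuda.is_available() else "cpu"):
    """Train `student` to mimic the teacher's soft predictions on unlabeled images."""
    teacher = teacher.to(device).eval()        # frozen, pre-trained teacher
    student = student.to(device).train()
    optimizer = torch.optim.Adam(student.parameters(), lr=lr)

    for _ in range(epochs):
        for images in unlabeled_loader:        # batches contain images only, no labels
            images = images.to(device)

            with torch.no_grad():
                # Soft targets: teacher logits smoothed by the temperature.
                teacher_probs = F.softmax(teacher(images) / temperature, dim=1)

            student_log_probs = F.log_softmax(student(images) / temperature, dim=1)

            # KL divergence between student and teacher output distributions,
            # scaled by T^2 as in the standard distillation formulation.
            loss = F.kl_div(student_log_probs, teacher_probs,
                            reduction="batchmean") * temperature ** 2

            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

    return student
```

After this unsupervised step, one plausible continuation of the pipeline sketched in the abstract is to fine-tune the student on whatever labeled images are available (e.g., from Messidor or EyePACS); that fine-tuning detail is an assumption here, not something stated in this record.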
Pages: 10
Related papers
50 records in total
  • [1] A Wrapped Approach Using Unlabeled Data for Diabetic Retinopathy Diagnosis
    Zhang, Xuefeng
    Kim, Youngsung
    Chung, Young-Chul
    Yoon, Sangcheol
    Rhee, Sang-Yong
    Kim, Yong Soo
    [J]. APPLIED SCIENCES-BASEL, 2023, 13 (03):
  • [2] Knowledge Distillation based Online Learning Methodology using Unlabeled Data Stream
    Seo, Sanghyun
    Park, Seongchul
    Jeong, Changhoon
    Kim, Juntae
    [J]. PROCEEDINGS OF THE 2018 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND MACHINE INTELLIGENCE (MLMI 2018), 2018, : 68 - 71
  • [3] Toward Lightweight Diabetic Retinopathy Classification: A Knowledge Distillation Approach for Resource-Constrained Settings
    Islam, Niful
    Jony, Md. Mehedi Hasan
    Hasan, Emam
    Sutradhar, Sunny
    Rahman, Atikur
    Islam, Md. Motaharul
    [J]. APPLIED SCIENCES-BASEL, 2023, 13 (22):
  • [4] Knowledge Distillation Based on Positive-Unlabeled Classification and Attention Mechanism
    Tang, Jialiang
    Liu, Mingjin
    Jiang, Ning
    Yu, Wenxin
    Yang, Changzheng
    Zhou, Jinjia
    [J]. 2021 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2021,
  • [5] Learning Phrase Patterns for Text Classification Using a Knowledge Graph and Unlabeled Data
    Marin, Alex
    Holenstein, Roman
    Sarikaya, Ruhi
    Ostendorf, Mari
    [J]. 15TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2014), VOLS 1-4, 2014, : 253 - 257
  • [6] Recognition of Leukemic Retinopathy Using Knowledge of Diabetic Retinopathy
    Gilberto Platas-Campero, Edgar
    Diaz Hernandez, Raquel
    Altamirano Robles, Leopoldo
    [J]. PATTERN RECOGNITION, MCPR 2024, 2024, 14755 : 243 - 252
  • [7] Classification of Diabetic Retinopathy using Capsules
    Vanusha, D.
    Amutha, B.
    [J]. INTERNATIONAL JOURNAL OF UNCERTAINTY FUZZINESS AND KNOWLEDGE-BASED SYSTEMS, 2021, 29 (06) : 835 - 854
  • [8] Lesion-aware knowledge distillation for diabetic retinopathy lesion segmentation
    Wang, Yaqi
    Hou, Qingshan
    Cao, Peng
    Yang, Jinzhu
    R. Zaiane, Osmar
    [J]. APPLIED INTELLIGENCE, 2024, 54 (02) : 1937 - 1956
  • [9] Automatic Diabetic Retinopathy Grading via Self-Knowledge Distillation
    Luo, Ling
    Xue, Dingyu
    Feng, Xinglong
    [J]. ELECTRONICS, 2020, 9 (09) : 1 - 13