Semi-supervised lung adenocarcinoma histopathology image classification based on multi-teacher knowledge distillation

Cited: 0
Authors
Wang, Qixuan [1 ]
Zhang, Yanjun [2 ]
Lu, Jun [2 ]
Li, Congsheng [1 ]
Zhang, Yungang [2 ]
Affiliations
[1] China Acad Informat & Commun Technol, Beijing, Peoples R China
[2] Capital Med Univ, Beijing Chao Yang Hosp, Dept Pathol, Beijing 100020, Peoples R China
Source
PHYSICS IN MEDICINE AND BIOLOGY | 2024, Vol. 69, No. 18
Keywords
lung adenocarcinoma; histopathology; whole slide image; image classification; semi-supervised learning; multi-teacher knowledge distillation; ASSOCIATION; PATTERN; CANCER;
DOI
10.1088/1361-6560/ad7454
CLC number
R318 [Biomedical Engineering]
Discipline code
0831
Abstract
Objective. In this study, we propose a semi-supervised learning (SSL) scheme using a patch-based deep learning (DL) framework to tackle the challenge of high-precision classification of seven lung tumor growth patterns from only a small amount of labeled data in whole slide images (WSIs). The scheme aims to improve generalization with limited data and to reduce dependence on large labeled datasets, addressing the high demand for annotated data that is common in medical image analysis. Approach. The study employs an SSL approach enhanced by a dynamic confidence threshold mechanism that adjusts according to the quantity and quality of the pseudo-labels generated. This dynamic thresholding avoids both the class imbalance among pseudo-labels and the shortage of pseudo-labels that a high fixed threshold can cause. Furthermore, the research introduces a multi-teacher knowledge distillation (MTKD) technique that adaptively weights the predictions of multiple teacher models, transferring reliable knowledge while shielding the student model from low-quality teacher predictions. Main results. The framework was trained and evaluated on a dataset of 150 WSIs, each representing one of the seven growth patterns. The experimental results demonstrate that the framework classifies lung tumor growth patterns in histopathology images with high accuracy, performing comparably to fully supervised models and human pathologists. In addition, the framework's evaluation metrics on a publicly available dataset exceed those of previous studies, indicating good generalizability. Significance. This research demonstrates that an SSL approach can achieve results comparable to fully supervised models and expert pathologists, opening new possibilities for efficient and cost-effective medical image analysis.
The implementation of dynamic confidence thresholding and MTKD techniques represents a significant advancement in applying DL to complex medical image analysis tasks. This advancement could lead to faster and more accurate diagnoses, ultimately improving patient outcomes and fostering the overall progress of healthcare technology.
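The two mechanisms named in the abstract can be sketched in a few lines of PyTorch. The abstract does not give the paper's exact formulas, so the per-class threshold scaling in `dynamic_threshold` and the confidence-based teacher weighting in `distill_targets` below are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def dynamic_threshold(base_tau, class_counts):
    """Per-class confidence threshold for pseudo-labeling: classes that
    have accumulated fewer pseudo-labels get a lower threshold so they
    are not starved. (Illustrative heuristic, not the paper's formula.)"""
    counts = class_counts.float()
    ratio = counts / counts.max().clamp(min=1.0)
    return base_tau * (0.5 + 0.5 * ratio)  # in [0.5 * base_tau, base_tau]

def distill_targets(teacher_logits, T=2.0):
    """Adaptively weight several teachers by prediction confidence
    (max softmax probability), then mix their softened distributions
    into one distillation target for the student."""
    probs = [F.softmax(logits / T, dim=-1) for logits in teacher_logits]
    conf = torch.stack([p.max(dim=-1).values for p in probs])  # (K, B)
    w = conf / conf.sum(dim=0, keepdim=True)                   # normalize over K teachers
    return sum(w[k].unsqueeze(-1) * probs[k] for k in range(len(probs)))

# Toy usage: 3 teachers, batch of 4 patches, 7 growth-pattern classes.
torch.manual_seed(0)
teachers = [torch.randn(4, 7) for _ in range(3)]
target = distill_targets(teachers)  # each row is a valid distribution
```

Weighting by teacher confidence is one simple way to realize the "safeguard the student from low-quality teacher predictions" idea: a teacher whose softened distribution is near-uniform on a sample contributes proportionally less to that sample's target.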
Pages: 16