Multi-teacher knowledge extraction for prostate cancer recognition in intelligent medical assistance systems

Times Cited: 0
Authors
Li, Linyuan [1 ]
Zhang, Qian [2 ]
Liu, Zhengqi [2 ]
Xi, Xinyi [2 ]
Zhang, Haonan [2 ]
Nan, Yahui [3 ]
Tu, Huijuan [4 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Chengdu 611731, Peoples R China
[2] Northwest Univ, Sch Informat Sci & Technol, Xian 710127, Peoples R China
[3] Lvliang Univ, Dept Comp Sci & Technol, Luliang 033000, Peoples R China
[4] Kunshan Hosp Chinese Med, Dept Radiol, Suzhou 234099, Peoples R China
Keywords
Transrectal ultrasound examination; knowledge distillation; multi-teacher; intelligent system
DOI
10.1142/S1793962325500035
Chinese Library Classification (CLC)
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
Intelligent diagnosis of prostate diseases in intelligent medical assistance systems has gradually become a research hotspot. However, automated diagnosis based on transrectal ultrasound (TRUS), the main imaging modality for prostate diseases, remains challenging for two reasons. (1) Prostate TRUS imaging data are limited, which makes it difficult to train a robust deep learning model. (2) In terms of visual features, TRUS images of prostate cancer resemble TRUS images of other tissues and organs, so a single neural network model struggles to learn an accurate feature representation of the disease. To address these problems, we first establish a high-quality prostate TRUS imaging dataset, and then design a multi-teacher knowledge distillation scheme to achieve accurate disease recognition. Experimental results show that, compared with training without a teacher model and with distillation from a single teacher model, knowledge distillation from multiple teacher models significantly improves the accuracy of cancer prediction on prostate TRUS images. Accuracy improves further as the number of teacher models increases, which verifies the effectiveness of the method in intelligent systems.
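For illustration, below is a minimal sketch of the kind of multi-teacher distillation objective the abstract describes: the student is trained against the ground-truth TRUS labels plus the averaged softened predictions of several teacher models. This is a generic PyTorch formulation under assumptions, not the authors' released code; the function name, the temperature value, and the equal-weight averaging of teachers are illustrative choices.

```python
import torch
import torch.nn.functional as F

def multi_teacher_distillation_loss(student_logits, teacher_logits_list,
                                    labels, temperature=4.0, alpha=0.5):
    """Vanilla multi-teacher KD loss (illustrative, not the paper's exact code):
    hard-label cross-entropy plus KL divergence to the averaged teacher
    distribution."""
    # Hard-label loss against the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)

    # Average the teachers' softened probability distributions
    # (equal teacher weights assumed here).
    teacher_probs = torch.stack(
        [F.softmax(t / temperature, dim=1) for t in teacher_logits_list]
    ).mean(dim=0)

    # KL divergence between the student's softened distribution and the
    # averaged teacher distribution, scaled by T^2 as in standard KD.
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    kd = F.kl_div(log_student, teacher_probs,
                  reduction="batchmean") * temperature ** 2

    return alpha * ce + (1.0 - alpha) * kd
```

In this formulation, adding another teacher only changes the averaged target distribution, which is consistent with the abstract's observation that accuracy improves as the number of teacher models grows.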
Pages: 15