More from Less: Self-supervised Knowledge Distillation for Routine Histopathology Data

Cited by: 0
Authors
Farndale, Lucas [1 ,2 ,3 ,4 ]
Insall, Robert [1 ,2 ,5 ]
Yuan, Ke [1 ,2 ,3 ]
Affiliations
[1] Univ Glasgow, Sch Canc Sci, Glasgow, Lanark, Scotland
[2] Canc Res UK Beatson Inst, Glasgow, Lanark, Scotland
[3] Univ Glasgow, Sch Comp Sci, Glasgow, Lanark, Scotland
[4] Univ Glasgow, Sch Math & Stat, Glasgow, Lanark, Scotland
[5] UCL, Div Biosci, London, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC); Wellcome Trust (UK)
Keywords
Representation Learning; Colon Cancer; Multi-Modality;
DOI
10.1007/978-3-031-45673-2_45
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Medical imaging technologies are generating increasingly large amounts of high-quality, information-dense data. Despite the progress, practical use of advanced imaging technologies for research and diagnosis remains limited by cost and availability, so more information-sparse data such as H&E stains are relied on in practice. The study of diseased tissue would greatly benefit from methods which can leverage these information-dense data to extract more value from routine, information-sparse data. Using self-supervised learning (SSL), we demonstrate that it is possible to distil knowledge during training from information-dense data into models which only require information-sparse data for inference. This improves downstream classification accuracy on information-sparse data, making it comparable with the fully-supervised baseline. We find substantial effects on the learned representations, and pairing with relevant data can be used to extract desirable features without the arduous process of manual labelling. This approach enables the design of models which require only routine images, but contain insights from state-of-the-art data, allowing better use of the available resources.
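The abstract describes training on paired information-dense and routine (information-sparse, e.g. H&E) images so that only the routine images are needed at inference. As a rough illustration only, the following is a minimal PyTorch sketch of one way such cross-modal self-supervised distillation could be set up, assuming a SimCLR-style contrastive objective over co-registered image pairs; the encoder architecture, loss, and all names (SmallEncoder, nt_xent, dense_encoder, sparse_encoder) are placeholders, not the paper's implementation.

```python
# Illustrative sketch of cross-modal self-supervised knowledge distillation.
# Assumption: a SimCLR-style contrastive loss over paired images; this is
# NOT the authors' architecture, only a minimal stand-in.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallEncoder(nn.Module):
    """Toy CNN encoder; in practice a ResNet-style backbone would be used."""
    def __init__(self, out_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, out_dim),
        )

    def forward(self, x):
        return self.net(x)

def nt_xent(z_a, z_b, temperature=0.1):
    """Contrastive loss pulling paired (dense, sparse) embeddings together."""
    z_a = F.normalize(z_a, dim=1)
    z_b = F.normalize(z_b, dim=1)
    logits = z_a @ z_b.t() / temperature        # (N, N) similarity matrix
    targets = torch.arange(z_a.size(0))         # positives lie on the diagonal
    return 0.5 * (F.cross_entropy(logits, targets)
                  + F.cross_entropy(logits.t(), targets))

# One encoder per modality; only `sparse_encoder` is kept for inference.
dense_encoder = SmallEncoder()    # information-dense modality (training only)
sparse_encoder = SmallEncoder()   # routine H&E-like modality
optimiser = torch.optim.Adam(
    list(dense_encoder.parameters()) + list(sparse_encoder.parameters()),
    lr=1e-3)

# Dummy batch standing in for co-registered dense/sparse image pairs.
x_dense = torch.randn(8, 3, 64, 64)
x_sparse = torch.randn(8, 3, 64, 64)

loss = nt_xent(dense_encoder(x_dense), sparse_encoder(x_sparse))
optimiser.zero_grad()
loss.backward()
optimiser.step()

# At inference, the information-dense modality is no longer required:
with torch.no_grad():
    routine_features = sparse_encoder(x_sparse)  # features from routine data only
```

The design point this sketch illustrates is that only the sparse-branch encoder's weights need to be retained, so any downstream classifier fine-tuned on its features sees routine images alone while still benefiting from the paired information-dense data used during training.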
Pages: 454-463
Number of pages: 10