Self supervised contrastive learning for digital histopathology

Cited by: 162
Authors
Ciga, Ozan [1 ,4 ]
Xu, Tony [2 ]
Martel, Anne Louise [1 ,3 ]
Affiliations
[1] Univ Toronto, Dept Med Biophys, Toronto, ON, Canada
[2] Univ British Columbia, Dept Elect & Comp Engn, 5500-2332 Main Mall, Vancouver, BC V6T 1Z4, Canada
[3] Sunnybrook Res Inst, Phys Sci, Toronto, ON, Canada
[4] Sunnybrook Hlth Sci Ctr, 2075 Bayview Ave,M6 609, Toronto, ON M4N 3M5, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
Self supervised learning; Digital histopathology; Whole slide images; Unsupervised learning; SPARSE AUTOENCODER; CANCER; CLASSIFICATION; NUCLEI; IMAGES;
DOI
10.1016/j.mlwa.2021.100198
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Unsupervised learning has been a long-standing goal of machine learning and is especially important for medical image analysis, where the learning can compensate for the scarcity of labeled datasets. A promising subclass of unsupervised learning is self-supervised learning, which aims to learn salient features using the raw input as the learning signal. In this work, we tackle the issue of learning domain-specific features without any supervision to improve multiple task performances that are of interest to the digital histopathology community. We apply a contrastive self-supervised learning method to digital histopathology by collecting and pretraining on 57 histopathology datasets without any labels. We find that combining multiple multi-organ datasets with different types of staining and resolution properties improves the quality of the learned features. Furthermore, we find that using more images for pretraining leads to better performance in multiple downstream tasks, although there are diminishing returns as more unlabeled images are incorporated into the pretraining. Linear classifiers trained on top of the learned features show that networks pretrained on digital histopathology datasets perform better than ImageNet-pretrained networks, boosting task performances by more than 28% in F1 scores on average. Interestingly, we did not observe a consistent correlation between the pretraining dataset site or organ and the downstream task (e.g., pretraining with only breast images does not necessarily lead to superior downstream task performance for breast-related tasks). These findings may also be useful when applying newer contrastive techniques to histopathology data. Pretrained PyTorch models are made publicly available at https://github.com/ozanciga/self-supervised-histopathology.
Pages: 14