Clustering-Guided Twin Contrastive Learning for Endomicroscopy Image Classification

Cited by: 0
Authors
Zhou, Jingjun [1 ]
Dong, Xiangjiang [2 ]
Liu, Qian [1 ,3 ]
Affiliations
[1] Hainan Univ, Sch Biomed Engn, Haikou 570228, Peoples R China
[2] Huazhong Univ Sci & Technol, Wuhan Natl Lab Optoelect, Wuhan 430074, Peoples R China
[3] Hainan Univ, Sch Biomed Engn, Key Lab Biomed Engn Hainan Prov, Haikou 570228, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Clustering; contrastive learning; image classification; gastrointestinal; probe-based confocal laser endomicroscopy (pCLE); CONFOCAL LASER ENDOMICROSCOPY; SAFETY;
DOI
10.1109/JBHI.2024.3366223
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Learning better representations is essential in medical image analysis for computer-aided diagnosis. However, learning discriminative semantic features is a major challenge due to the lack of large-scale, well-annotated datasets. How, then, can we learn a well-structured, categorizable embedding space from limited-scale and unlabeled datasets? In this paper, we propose a novel clustering-guided twin contrastive learning framework (CTCL) that learns discriminative representations of probe-based confocal laser endomicroscopy (pCLE) images for gastrointestinal (GI) tumor classification. Unlike traditional contrastive learning, which considers only two randomly augmented views of the same instance, CTCL aligns more semantically related and class-consistent samples through clustering, improving intra-class tightness and inter-class variability to produce more informative representations. Furthermore, exploiting the inherent properties of CLE (geometric invariance and intrinsic noise), we propose to treat CLE images rotated by arbitrary angles and CLE images corrupted by different noise as the same instance, respectively, to increase the variability and diversity of samples. CTCL is optimized in an end-to-end expectation-maximization framework. Comprehensive experimental results demonstrate that CTCL-based visual representations achieve competitive performance on each downstream task, as well as greater robustness and transferability, compared with existing state-of-the-art self-supervised learning (SSL) and supervised methods. Notably, CTCL achieved 75.60%/78.45% and 64.12%/77.37% top-1 accuracy on the linear evaluation protocol and few-shot classification downstream tasks, respectively, outperforming the previous best results by 1.27%/1.63% and 0.5%/3%, respectively. The proposed method holds great potential to assist pathologists in achieving automated, fast, and high-precision diagnosis of GI tumors and in accurately determining different stages of tumor development from CLE images.
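The abstract describes two technical ingredients: twin augmented views built from CLE-specific invariances (arbitrary-angle rotation and intrinsic noise), and a contrastive objective whose positives are cluster-consistent samples rather than only the two views of one instance. The sketch below illustrates both ideas under stated assumptions; it is not the authors' implementation. PyTorch is assumed, and the helper names (twin_views, cluster_nce_loss) and hyperparameters (noise_std, temperature) are hypothetical.

    # Minimal sketch (not the authors' code): twin views via arbitrary-angle
    # rotation and additive noise, plus a clustering-guided contrastive loss
    # in which samples sharing a pseudo-cluster label are treated as positives.
    import random

    import torch
    import torch.nn.functional as F
    import torchvision.transforms.functional as TF


    def twin_views(image: torch.Tensor, noise_std: float = 0.05):
        """Return two augmented views of one pCLE image tensor (C, H, W).

        View 1: rotation by an arbitrary angle (CLE images are assumed to be
        rotation-invariant).  View 2: additive Gaussian noise (mimicking the
        intrinsic noise of the imaging probe).
        """
        angle = random.uniform(0.0, 360.0)
        view_rot = TF.rotate(image, angle)
        view_noise = image + noise_std * torch.randn_like(image)
        return view_rot, view_noise


    def cluster_nce_loss(z: torch.Tensor, pseudo_labels: torch.Tensor,
                         temperature: float = 0.5) -> torch.Tensor:
        """Contrastive loss with cluster-consistent positives.

        z:             (B, d) embeddings of all augmented views in the batch.
        pseudo_labels: (B,) cluster assignments, e.g. from k-means on features.
        """
        z = F.normalize(z, dim=1)
        sim = z @ z.t() / temperature                    # (B, B) similarities
        mask_self = torch.eye(z.size(0), dtype=torch.bool, device=z.device)
        sim = sim.masked_fill(mask_self, float('-inf'))  # exclude self-pairs

        # Log-softmax over each row: all non-self samples act as candidates.
        log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

        # Positives: samples assigned to the same pseudo-cluster (excluding self).
        pos = pseudo_labels.unsqueeze(0) == pseudo_labels.unsqueeze(1)
        pos = pos & ~mask_self
        pos_count = pos.sum(dim=1).clamp(min=1)

        # Average the negative log-likelihood over each anchor's positives.
        loss = -log_prob.masked_fill(~pos, 0.0).sum(dim=1) / pos_count
        return loss.mean()

In the full CTCL framework described by the abstract, such a loss would sit inside an end-to-end expectation-maximization loop in which cluster assignments (E-step) and representations (M-step) are alternately refined; the sketch covers only a single loss computation.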
Pages: 2879-2890
Page count: 12