Twin Contrastive Learning with Noisy Labels

Cited by: 20
Authors
Huang, Zhizhong [1 ]
Zhang, Junping [1 ]
Shan, Hongming [2 ,3 ,4 ]
Affiliations
[1] Fudan Univ, Shanghai Key Lab Intelligent Informat Proc, Sch Comp Sci, Shanghai 200433, Peoples R China
[2] Fudan Univ, Inst Sci & Technol Brain Inspired Intelligence, Shanghai 200433, Peoples R China
[3] Fudan Univ, MOE Frontiers Ctr Brain Sci, Shanghai 200433, Peoples R China
[4] Shanghai Ctr Brain Sci & Brain Inspired Technol, Shanghai 200031, Peoples R China
DOI
10.1109/CVPR52729.2023.01122
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
Learning from noisy data is a challenging task that significantly degrades model performance. In this paper, we present TCL, a novel twin contrastive learning model to learn robust representations and handle noisy labels for classification. Specifically, we construct a Gaussian mixture model (GMM) over the representations by injecting the supervised model predictions into the GMM, linking label-free latent variables in the GMM with label-noisy annotations. Then, TCL detects examples with wrong labels as out-of-distribution examples via another two-component GMM, taking the data distribution into account. We further propose cross-supervision with an entropy regularization loss that bootstraps the true targets from model predictions to handle the noisy labels. As a result, TCL can learn discriminative representations aligned with estimated labels through mixup and contrastive learning. Extensive experimental results on several standard benchmarks and real-world datasets demonstrate the superior performance of TCL. In particular, TCL achieves a 7.5% improvement on CIFAR-10 with 90% noisy labels, an extremely noisy scenario. The source code is available at https://github.com/Hzzone/TCL.
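The two-component GMM mentioned in the abstract follows a common pattern in noisy-label learning: fit a two-mode mixture to a per-sample scalar (e.g. a loss or an out-of-distribution score), and treat samples assigned to the low-mean component as clean. The sketch below is not the paper's implementation (TCL builds its GMM over learned representations with predictions injected); it only illustrates the generic detection step with a minimal, stdlib-only EM fit on synthetic per-sample losses, and all names and values here are illustrative assumptions.

```python
import math
import random

def fit_two_component_gmm(xs, iters=100):
    """Fit a 1-D two-component Gaussian mixture with EM.

    Returns the component means and per-sample posterior responsibilities.
    """
    mu = [min(xs), max(xs)]            # initialize the means at the extremes
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    resp = [[0.5, 0.5] for _ in xs]
    for _ in range(iters):
        # E-step: posterior probability of each component for every sample
        for i, x in enumerate(xs):
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = p[0] + p[1]
            resp[i] = [p[0] / s, p[1] / s] if s > 0 else [0.5, 0.5]
        # M-step: re-estimate mixture weights, means, and variances
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, xs)) / nk, 1e-6)
    return mu, resp

# Synthetic per-sample losses: clean samples cluster at low loss,
# mislabeled samples at high loss (values chosen only for illustration).
random.seed(0)
losses = ([random.gauss(0.2, 0.05) for _ in range(80)]
          + [random.gauss(2.0, 0.3) for _ in range(20)])

mu, resp = fit_two_component_gmm(losses)
low = 0 if mu[0] < mu[1] else 1        # component with the smaller mean = "clean"
clean_mask = [r[low] > 0.5 for r in resp]
```

In practice the mask would gate which labels are trusted in the next training round, with the flagged samples treated as unlabeled or relabeled from model predictions, as in the cross-supervision scheme the abstract describes.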
Pages: 11661-11670 (10 pages)
Related papers (50 total)
  • [1] On Learning Contrastive Representations for Learning with Noisy Labels
    Yi, Li
    Liu, Sheng
    She, Qi
    McLeod, A. Ian
    Wang, Boyu
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 16661 - 16670
  • [2] Supervised contrastive learning with corrected labels for noisy label learning
    Ouyang, Jihong
    Lu, Chenyang
    Wang, Bing
    Li, Changchun
    APPLIED INTELLIGENCE, 2023, 53 (23) : 29378 - 29392
  • [4] A Framework Using Contrastive Learning for Classification with Noisy Labels
    Ciortan, Madalina
    Dupuis, Romain
    Peel, Thomas
    DATA, 2021, 6 (06)
  • [5] Selective-Supervised Contrastive Learning with Noisy Labels
    Li, Shikun
    Xia, Xiaobo
    Ge, Shiming
    Liu, Tongliang
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 316 - 325
  • [6] NCMatch: Semi-supervised Learning with Noisy Labels via Noisy Sample Filter and Contrastive Learning
    Sun, Yuanbo
    Gao, Can
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT VIII, 2024, 14432 : 15 - 27
  • [7] Learning with noisy labels using collaborative sample selection and contrastive semi
    Miao, Qing
    Wu, Xiaohe
    Xu, Chao
    Ji, Yanli
    Zuo, Wangmeng
    Guo, Yiwen
    Meng, Zhaopeng
    KNOWLEDGE-BASED SYSTEMS, 2024, 296
  • [8] ECLB: Efficient contrastive learning on bi-level for noisy labels
    Guan, Juwei
    Liu, Jiaxiang
    Huang, Shuying
    Yang, Yong
    KNOWLEDGE-BASED SYSTEMS, 2024, 300
  • [9] Triple Contrastive Representation Learning for Hyperspectral Image Classification With Noisy Labels
    Zhang, Xinyu
    Yang, Shuyuan
    Feng, Zhixi
    Song, Liangliang
    Wei, Yantao
    Jiao, Licheng
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2023, 61
  • [10] Contrastive Learning Joint Regularization for Pathological Image Classification with Noisy Labels
    Guo, Wenping
    Han, Gang
    Mo, Yaling
    Zhang, Haibo
    Fang, Jiangxiong
    Zhao, Xiaoming
    ELECTRONICS, 2024, 13 (13)