Understanding and Mitigating Human-Labelling Errors in Supervised Contrastive Learning

Cited: 0
Authors
Long, Zijun [1]
Zhuang, Lipeng [1]
Killick, George [1]
McCreadie, Richard [1]
Aragon-Camarasa, Gerardo [1]
Henderson, Paul [1]
Affiliations
[1] University of Glasgow, Glasgow, Scotland
Keywords
Contrastive Learning; Label Noise; Image Classification
DOI
10.1007/978-3-031-72949-2_25
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Human-annotated vision datasets inevitably contain a fraction of human-mislabelled examples. While the detrimental effects of such mislabelling on supervised learning are well-researched, their influence on Supervised Contrastive Learning (SCL) remains largely unexplored. In this paper, we show that human-labelling errors not only differ significantly from synthetic label errors, but also pose unique challenges in SCL, distinct from those in traditional supervised learning methods. Specifically, our results indicate that they adversely impact the learning process in approximately 99% of cases when they occur as false positive samples. Existing noise-mitigating methods primarily focus on synthetic label errors and tackle the unrealistic setting of very high synthetic noise rates (40-80%), but they often underperform on common image datasets due to overfitting. To address this issue, we introduce SCL-RHE, a novel SCL objective that is robust to human-labelling errors. SCL-RHE is designed to mitigate the effects of real-world mislabelled examples, which are typically characterized by much lower noise rates (<5%). We demonstrate that SCL-RHE consistently outperforms state-of-the-art representation learning and noise-mitigating methods across various vision benchmarks, offering improved resilience against human-labelling errors.
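For context, below is a minimal PyTorch sketch of the standard supervised contrastive (SupCon) objective that SCL-RHE builds on; the function name supcon_loss, the temperature value, and the batch shapes are illustrative assumptions, and the paper's actual SCL-RHE modification (mitigating likely false-positive pairs) is not shown. In SupCon, every other same-label sample in the batch is treated as a positive for the anchor, which is why a single mislabelled example contaminates the positive set of every anchor sharing its (incorrect) label.

    # Sketch of the standard SupCon loss (Khosla et al., 2020); not the SCL-RHE objective.
    import torch
    import torch.nn.functional as F

    def supcon_loss(features, labels, temperature=0.1):
        # features: (N, D) encoder embeddings; labels: (N,) integer class labels
        z = F.normalize(features, dim=1)                  # project onto the unit sphere
        sim = torch.matmul(z, z.T) / temperature          # pairwise similarities scaled by tau

        n = z.size(0)
        self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
        # positives: all other samples sharing the anchor's label
        pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask

        # softmax denominator runs over all samples except the anchor itself
        sim = sim.masked_fill(self_mask, float('-inf'))
        log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

        # average negative log-likelihood over each anchor's positives;
        # anchors with no positive in the batch contribute zero
        pos_counts = pos_mask.sum(dim=1).clamp(min=1)
        loss_per_anchor = -log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1) / pos_counts
        return loss_per_anchor.mean()

    # toy usage: 8 random embeddings of dimension 128, 3 classes
    feats = torch.randn(8, 128)
    labs = torch.randint(0, 3, (8,))
    print(supcon_loss(feats, labs))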
Pages: 435-454
Page count: 20
Related Papers
50 records
  • [41] Supervised Contrastive Learning for Facial Kinship Recognition. Zhang, Ximiao; Xu, Min; Zhou, Xiuzhuang; Guo, Guodong. 16th IEEE International Conference on Automatic Face and Gesture Recognition (FG 2021), 2021.
  • [42] Adversarial Self-Supervised Contrastive Learning. Kim, Minseon; Tack, Jihoon; Hwang, Sung Ju. Advances in Neural Information Processing Systems (NeurIPS 2020), 2020, 33.
  • [43] Contrastive Regularization for Semi-Supervised Learning. Lee, Doyup; Kim, Sungwoong; Kim, Ildoo; Cheon, Yeongjae; Cho, Minsu; Han, Wook-Shin. IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW 2022), 2022: 3910-3919.
  • [44] Deep contrastive representation learning for supervised tasks. Duan, Chenguang; Jiao, Yuling; Kang, Lican; Yang, Jerry Zhijian; Zhou, Fusheng. Pattern Recognition, 2025, 161.
  • [45] A Survey on Contrastive Self-Supervised Learning. Jaiswal, Ashish; Babu, Ashwin Ramesh; Zadeh, Mohammad Zaki; Banerjee, Debapriya; Makedon, Fillia. Technologies, 2021, 9(1).
  • [46] JGCL: Joint Self-Supervised and Supervised Graph Contrastive Learning. Akkas, Selahattin; Azad, Ariful. Companion Proceedings of the Web Conference 2022 (WWW 2022 Companion), 2022: 1099-1105.
  • [47] Supervised contrastive learning with corrected labels for noisy label learning. Ouyang, Jihong; Lu, Chenyang; Wang, Bing; Li, Changchun. Applied Intelligence, 2023, 53(23): 29378-29392.
  • [48] ComCo: Complementary supervised contrastive learning for complementary label learning. Jiang, Haoran; Sun, Zhihao; Tian, Yingjie. Neural Networks, 2024, 169: 44-56.
  • [49] Contrastive UCB: Provably Efficient Contrastive Self-Supervised Learning in Online Reinforcement Learning. Qiu, Shuang; Wang, Lingxiao; Bai, Chenjia; Yang, Zhuoran; Wang, Zhaoran. International Conference on Machine Learning (ICML 2022), Vol. 162, 2022.