Boosting semi-supervised learning with Contrastive Complementary Labeling

Cited by: 3
Authors
Deng, Qinyi [1 ]
Guo, Yong [1 ]
Yang, Zhibang [1 ]
Pan, Haolin [1 ]
Chen, Jian [1 ]
Affiliations
[1] South China Univ Technol, Guangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Semi-supervised learning; Contrastive learning; Complementary labels;
DOI
10.1016/j.neunet.2023.11.052
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Semi-supervised learning (SSL) approaches have achieved great success in leveraging a large amount of unlabeled data to learn deep models. Among them, one popular approach is pseudo-labeling, which generates pseudo labels only for those unlabeled data with high-confidence predictions. As for the low-confidence ones, existing methods often simply discard them because these unreliable pseudo labels may mislead the model. Unlike existing methods, we highlight that these low-confidence data can still be beneficial to the training process. Specifically, although we cannot determine which class a low-confidence sample belongs to, we can assume that this sample is very unlikely to belong to the classes with the lowest predicted probabilities (often called complementary classes/labels). Inspired by this, we propose a novel Contrastive Complementary Labeling (CCL) method that constructs a large number of reliable negative pairs based on the complementary labels and adopts contrastive learning to make use of all the unlabeled data. Extensive experiments demonstrate that CCL significantly improves performance on top of existing advanced methods and is particularly effective under label-scarce settings. For example, CCL yields an improvement of 2.43% over FixMatch on CIFAR-10 with only 40 labeled samples.
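The complementary-labeling idea in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration, not the paper's actual implementation: the function names, the choice of the k lowest-probability classes, and the pairing of low-confidence samples against confidently pseudo-labeled ones are all assumptions made for illustration.

```python
import numpy as np

def complementary_labels(probs: np.ndarray, k: int) -> np.ndarray:
    """For each row of class probabilities, return the indices of the
    k lowest-probability classes: the complementary labels, i.e. classes
    the sample is very unlikely to belong to."""
    # argsort ascending, keep the first k columns (smallest probabilities)
    return np.argsort(probs, axis=1)[:, :k]

def negative_pairs(probs, pseudo_labels, confident_mask, k=2):
    """Illustrative pairing scheme (an assumption, not the paper's exact
    rule): pair each low-confidence sample i with every high-confidence
    sample j whose pseudo label is one of i's complementary classes.
    Such pairs are reliable negatives for a contrastive loss."""
    comp = complementary_labels(probs, k)
    pairs = []
    for i in np.where(~confident_mask)[0]:       # low-confidence samples
        for j in np.where(confident_mask)[0]:    # confidently pseudo-labeled
            if pseudo_labels[j] in comp[i]:
                pairs.append((int(i), int(j)))
    return pairs

# Tiny demo: sample 0 is low-confidence, sample 1 confidently predicts class 3.
probs = np.array([[0.50, 0.30, 0.15, 0.05],
                  [0.02, 0.03, 0.05, 0.90]])
pairs = negative_pairs(probs,
                       pseudo_labels=np.array([0, 3]),
                       confident_mask=np.array([False, True]),
                       k=2)
```

In the demo, sample 0's two complementary classes are 2 and 3 (its lowest probabilities), and sample 1 is confidently pseudo-labeled as class 3, so (0, 1) becomes a negative pair; a contrastive loss would then push their representations apart.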
Pages: 417 - 426
Number of pages: 10