Universum-Inspired Supervised Contrastive Learning

Cited by: 3
Authors
Han, Aiyang [1 ]
Chen, Songcan [1 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Nanjing, Peoples R China
Source
Keywords
Mixup; Contrastive learning; Supervised learning; Universum
DOI
10.1007/978-3-031-25198-6_34
CLC number
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Mixup is an efficient data augmentation method that generates additional samples through convex combinations of original data points and their corresponding labels. Although theoretically dependent on data properties, Mixup is reported to perform well as a regularizer and calibrator, contributing reliable robustness and generalization to neural network training. In this paper, inspired by Universum Learning, which uses out-of-class samples to assist the target tasks, we investigate Mixup from a largely under-explored perspective: its potential to generate in-domain samples that belong to none of the target classes, that is, universum. We find that within the framework of supervised contrastive learning, universum-style Mixup produces surprisingly high-quality hard negatives, greatly relieving the need for large batch sizes in contrastive learning. With these findings, we propose Universum-inspired Contrastive learning (UniCon), which incorporates a Mixup strategy to generate universum data as hard negatives and pushes them apart from anchor samples of the target classes. Our approach not only improves Mixup with hard labels, but also introduces a novel measure for generating universum data. With a linear classifier on the learned representations, our method achieves 81.68% top-1 accuracy on CIFAR-100 with ResNet-50, surpassing the state of the art by a significant margin of 5% with a much smaller batch size.
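To make the mechanism concrete, below is a minimal PyTorch sketch of the two ideas the abstract describes: universum-style Mixup across classes, and a contrastive loss that treats the resulting mixtures as hard negatives. Everything here is an illustrative assumption rather than the authors' implementation: the function names, the fixed mixing coefficient lam = 0.5, and the temperature value are hypothetical, and the paper's actual universum-generation measure and objective differ in detail.

    import torch
    import torch.nn.functional as F

    def universum_mixup(x, y, lam=0.5):
        # Mix each sample with one from a different class, so the mixture
        # plausibly belongs to none of the target classes (universum-style).
        # lam = 0.5 is an illustrative choice, not the paper's rule.
        n = x.size(0)
        perm = torch.randperm(n)
        for _ in range(10):  # best-effort retry to avoid same-class pairs
            if not bool((y == y[perm]).any()):
                break
            perm = torch.randperm(n)
        return lam * x + (1.0 - lam) * x[perm]

    def unicon_style_loss(anchor, positive, universum, temperature=0.1):
        # Contrastive loss in which Mixup-induced universum embeddings act
        # as extra hard negatives. All inputs are assumed to be L2-normalized
        # embeddings of shape (N, d); this sketches the idea, not UniCon's
        # exact objective.
        pos = (anchor * positive).sum(dim=1, keepdim=True) / temperature  # (N, 1)
        neg = anchor @ universum.t() / temperature                        # (N, N)
        logits = torch.cat([pos, neg], dim=1)                             # (N, 1+N)
        target = torch.zeros(anchor.size(0), dtype=torch.long)            # positive at index 0
        return F.cross_entropy(logits, target)

In this sketch each anchor contrasts its single positive against a batch of universum mixtures; since every mixture sits between classes, it serves as a hard negative for all anchors, consistent with the abstract's observation that universum negatives reduce the dependence on very large batch sizes.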
Pages: 459-473
Page count: 15
Related papers
50 records in total
  • [31] Deep contrastive representation learning for supervised tasks
    Duan, Chenguang
    Jiao, Yuling
    Kang, Lican
    Yang, Jerry Zhijian
    Zhou, Fusheng
    PATTERN RECOGNITION, 2025, 161
  • [32] A Survey on Contrastive Self-Supervised Learning
    Jaiswal, Ashish
    Babu, Ashwin Ramesh
    Zadeh, Mohammad Zaki
    Banerjee, Debapriya
    Makedon, Fillia
    TECHNOLOGIES, 2021, 9 (01)
  • [33] Online Class Incremental Contrastive Learning Based on Incremental Mixup-induced Universum
    Liu, Yu-Wei
    Chen, Song-Can
Ruan Jian Xue Bao/Journal of Software, 2024, 35 (12): 5544 - 5557
  • [34] JGCL: Joint Self-Supervised and Supervised Graph Contrastive Learning
    Akkas, Selahattin
    Azad, Ariful
    COMPANION PROCEEDINGS OF THE WEB CONFERENCE 2022, WWW 2022 COMPANION, 2022, : 1099 - 1105
  • [35] Supervised contrastive learning with corrected labels for noisy label learning
    Ouyang, Jihong
    Lu, Chenyang
    Wang, Bing
    Li, Changchun
    APPLIED INTELLIGENCE, 2023, 53 (23) : 29378 - 29392
  • [36] ComCo: Complementary supervised contrastive learning for complementary label learning
    Jiang, Haoran
    Sun, Zhihao
    Tian, Yingjie
    NEURAL NETWORKS, 2024, 169 : 44 - 56
  • [37] Contrastive UCB: Provably Efficient Contrastive Self-Supervised Learning in Online Reinforcement Learning
    Qiu, Shuang
    Wang, Lingxiao
    Bai, Chenjia
    Yang, Zhuoran
    Wang, Zhaoran
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [39] Augmenting Few-Shot Learning With Supervised Contrastive Learning
    Lee, Taemin
    Yoo, Sungjoo
    IEEE ACCESS, 2021, 9 : 61466 - 61474
  • [40] CONTRASTIVE HEARTBEATS: CONTRASTIVE LEARNING FOR SELF-SUPERVISED ECG REPRESENTATION AND PHENOTYPING
    Wei, Crystal T.
    Hsieh, Ming-En
    Liu, Chien-Liang
    Tseng, Vincent S.
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 1126 - 1130