Universum-Inspired Supervised Contrastive Learning

Cited by: 3
Authors
Han, Aiyang [1 ]
Chen, Songcan [1 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Nanjing, Peoples R China
Source
Keywords
Mixup; Contrastive learning; Supervised learning; Universum
DOI
10.1007/978-3-031-25198-6_34
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Mixup is an efficient data augmentation method that generates additional samples through convex combinations of original data points and their labels. Although theoretically dependent on data properties, Mixup is reported to perform well as a regularizer and calibrator, contributing reliable robustness and generalization to neural network training. In this paper, inspired by Universum Learning, which uses out-of-class samples to assist the target tasks, we investigate Mixup from a largely under-explored perspective: its potential to generate in-domain samples that belong to none of the target classes, i.e., universum. We find that, in the framework of supervised contrastive learning, universum-style Mixup produces surprisingly high-quality hard negatives, greatly relieving the need for large batch sizes in contrastive learning. With these findings, we propose Universum-inspired Contrastive learning (UniCon), which incorporates a Mixup strategy to generate universum data as hard negatives and pushes them apart from anchor samples of the target classes. Our approach not only improves Mixup with hard labels but also innovates a novel measure to generate universum data. With a linear classifier on the learned representations, our method with ResNet-50 achieves 81.68% top-1 accuracy on CIFAR-100, surpassing the state of the art by a significant margin of 5% with a much smaller batch size.
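The core idea of the abstract can be illustrated with a minimal sketch: mixing two samples from different classes with a coefficient near 0.5 yields an in-domain point that belongs to neither class, usable as a hard negative. This is an illustrative toy in NumPy, not the paper's implementation; the function name `mixup_universum` and the balanced coefficient are assumptions for demonstration.

```python
import numpy as np

def mixup_universum(x_a, x_b, lam=0.5):
    """Convex combination of two samples drawn from different classes.

    With lam near 0.5 the mixture sits between the two classes, so it
    plausibly belongs to neither -- a universum-style sample that can
    serve as a hard negative in supervised contrastive learning.
    (Hypothetical sketch; the paper's exact mixing strategy may differ.)
    """
    return lam * x_a + (1.0 - lam) * x_b

# Toy one-hot-like "images" from two different classes.
x_cat = np.array([1.0, 0.0, 0.0])
x_dog = np.array([0.0, 1.0, 0.0])

u = mixup_universum(x_cat, x_dog, lam=0.5)
# u is equidistant from both sources: in-domain but out-of-class.
```

In a contrastive loss such a mixture would be treated as a negative for anchors of both source classes, which is what relaxes the need for very large batches of naturally occurring negatives.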
Pages: 459-473
Page count: 15
Related papers
50 records total
  • [1] Universum-Inspired Supervised Contrastive Learning
    Han, Aiyang
    Geng, Chuanxing
    Chen, Songcan
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2023, 32 : 4275 - 4286
  • [2] Supervised Contrastive Learning
    Khosla, Prannay
    Teterwak, Piotr
    Wang, Chen
    Sarna, Aaron
    Tian, Yonglong
    Isola, Phillip
    Maschinot, Aaron
    Liu, Ce
    Krishnan, Dilip
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [3] Semi-Supervised Text Classification With Universum Learning
    Liu, Chien-Liang
    Hsaio, Wen-Hoar
    Lee, Chia-Hoang
    Chang, Tao-Hsing
    Kuo, Tsung-Hsun
    IEEE TRANSACTIONS ON CYBERNETICS, 2016, 46 (02) : 462 - 473
  • [4] Supervised contrastive learning for recommendation
    Yang, Chun
    Zou, Jianxiao
    Wu, JianHua
    Xu, Hongbing
    Fan, Shicai
    KNOWLEDGE-BASED SYSTEMS, 2022, 258
  • [5] Adversarial supervised contrastive learning
    Li, Zhuorong
    Yu, Daiwei
    Wu, Minghui
    Jin, Canghong
    Yu, Hongchuan
    MACHINE LEARNING, 2023, 112 (06) : 2105 - 2130
  • [7] Supervised Spatially Contrastive Learning
    Nakashima, Kodai
    Kataoka, Hirokatsu
    Iwata, Kenji
    Suzuki, Ryota
    Satoh, Yutaka
    Seimitsu Kogaku Kaishi/Journal of the Japan Society for Precision Engineering, 2022, 88 (01): : 66 - 71
  • [8] Weakly Supervised Contrastive Learning
    Zheng, Mingkai
    Wang, Fei
    You, Shan
    Qian, Chen
    Zhang, Changshui
    Wang, Xiaogang
    Xu, Chang
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 10022 - 10031
  • [9] Selecting Informative Universum Sample for Semi-Supervised Learning
    Chen, Shuo
    Zhang, Changshui
    21ST INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE (IJCAI-09), PROCEEDINGS, 2009, : 1016 - 1021
  • [10] Supervised Contrastive Learning for Affect Modelling
    Pinitas, Kosmas
    Makantasis, Konstantinos
    Liapis, Antonios
    Yannakakis, Georgios N.
    PROCEEDINGS OF THE 2022 INTERNATIONAL CONFERENCE ON MULTIMODAL INTERACTION, ICMI 2022, 2022, : 531 - 539