Contrastive learning with semantic consistency constraint

Cited by: 1
Authors
Guo, Huijie [1 ]
Shi, Lei [1 ]
Affiliations
[1] Beihang Univ, Beijing Adv Innovat Ctr Big Data & Brain Comp, Beijing, Peoples R China
Keywords
Representation learning; Contrastive learning; Semantic consistency;
DOI
10.1016/j.imavis.2023.104754
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405
Abstract
Contrastive representation learning (CL) can be viewed as an anchor-based learning paradigm that learns representations by maximizing the similarity between an anchor and positive samples while reducing its similarity with negative samples. A randomly adopted data augmentation strategy generates the positive and negative samples, which introduces semantic inconsistency into the learning process: the randomness may add disturbances to the original sample that reverse its identity. Moreover, the negative-sample demarcation strategy lets the negative set contain samples semantically similar to the anchor, called false negative samples. Consequently, CL's maximization and reduction process incorporates distractors into the learned feature representation. In this paper, we propose a novel Semantic Consistency Regularization (SCR) method to alleviate this problem. Specifically, we introduce a new regularization term, the pairwise subspace distance, to constrain the consistency of feature distributions across different views. Furthermore, we propose a divide-and-conquer strategy that keeps SCR well-suited to large mini-batch settings. Empirically, results on multiple small and large benchmark datasets demonstrate that SCR outperforms state-of-the-art methods. Code is available at https://github.com/PaulGHJ/SCR.git.
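The anchor-based maximize/reduce objective and the subspace-consistency regularizer described in the abstract can be sketched as follows. The abstract does not spell out the exact form of the pairwise subspace distance, so the projector-based version below is only one plausible instantiation; the names `info_nce_loss`, `subspace_distance`, `scr_objective`, the subspace rank `k`, and the weight `lam` are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.5):
    """Standard InfoNCE contrastive loss between two augmented views.

    z1, z2: (N, d) embeddings; row i of z1 and row i of z2 are the two
    views (positive pair) of the same sample, all other rows are negatives.
    """
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature              # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))            # positives on the diagonal

def subspace_distance(z1, z2, k=4):
    """Hypothetical 'pairwise subspace distance': squared Frobenius distance
    between the top-k principal-subspace projectors of the two views."""
    def projector(z):
        # Orthonormal basis of the top-k right singular subspace.
        _, _, vt = np.linalg.svd(z - z.mean(axis=0), full_matrices=False)
        v = vt[:k].T
        return v @ v.T
    p1, p2 = projector(z1), projector(z2)
    return np.linalg.norm(p1 - p2, 'fro') ** 2

def scr_objective(z1, z2, lam=0.1):
    """Contrastive loss plus the subspace-consistency regularizer."""
    return info_nce_loss(z1, z2) + lam * subspace_distance(z1, z2)
```

The regularizer vanishes when the two views span the same principal subspace, which is one concrete way to "constrain the consistency of distributions across different views" as the abstract puts it.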
Pages: 9
Related papers
50 items in total
  • [11] Learning Contrastive Representation for Semantic Correspondence
    Taihong Xiao
    Sifei Liu
    Shalini De Mello
    Zhiding Yu
    Jan Kautz
    Ming-Hsuan Yang
    International Journal of Computer Vision, 2022, 130 : 1293 - 1309
  • [12] Sequential and Dynamic constraint Contrastive Learning for Reinforcement Learning
    Shen, Weijie
    Yuan, Lei
    Huang, Junfu
    Gao, Songyi
    Huang, Yuyang
    Yu, Yang
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [13] Regularized Contrastive Learning of Semantic Search
    Tan, Mingxi
    Rolland, Alexis
    Tian, Andong
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, NLPCC 2022, PT I, 2022, 13551 : 119 - 130
  • [14] Learning Contrastive Representation for Semantic Correspondence
    Xiao, Taihong
    Liu, Sifei
    De Mello, Shalini
    Yu, Zhiding
    Kautz, Jan
    Yang, Ming-Hsuan
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2022, 130 (05) : 1293 - 1309
  • [15] Hierarchical Contrastive Learning for Semantic Segmentation
    Jiang, Jie
    He, Xingjian
    Wang, Weining
    Lu, Hanqing
    Liu, Jing
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024,
  • [16] Learning with Fantasy: Semantic-Aware Virtual Contrastive Constraint for Few-Shot Class-Incremental Learning
    Song, Zeyin
    Zhao, Yifan
    Shi, Yujun
    Peng, Peixi
    Yuan, Li
    Tian, Yonghong
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 24183 - 24192
  • [17] Improving Augmentation Consistency for Graph Contrastive Learning
    Bu, Weixin
    Cao, Xiaofeng
    Zheng, Yizhen
    Pan, Shirui
    PATTERN RECOGNITION, 2024, 148
  • [18] Curriculum Consistency Learning and Multi-Scale Contrastive Constraint in Semi-Supervised Medical Image Segmentation
    Ding, Weizhen
    Li, Zhen
    BIOENGINEERING-BASEL, 2024, 11 (01):
  • [19] Strengthen contrastive semantic consistency for fine-grained image classification
    Wang, Yupeng
    Wang, Yongli
    Ye, Qiaolin
    Lang, Wenxi
    Xu, Can
    PATTERN ANALYSIS AND APPLICATIONS, 2025, 28 (02)
  • [20] SEMANTIC-ENHANCED SUPERVISED CONTRASTIVE LEARNING
    Zhang, Pingyue
    Wu, Mengyue
    Yu, Kai
    2024 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING, ICASSP 2024, 2024, : 6030 - 6034