Contrastive learning with semantic consistency constraint

Cited by: 1
|
Authors
Guo, Huijie [1 ]
Shi, Lei [1 ]
Affiliations
[1] Beihang Univ, Beijing Adv Innovat Ctr Big Data & Brain Comp, Beijing, Peoples R China
Keywords
Representation learning; Contrastive learning; Semantic consistency
DOI
10.1016/j.imavis.2023.104754
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Contrastive representation learning (CL) can be viewed as an anchor-based learning paradigm that learns representations by maximizing the similarity between an anchor and positive samples while reducing its similarity to negative samples. Positive and negative samples are generated by a randomly adopted data augmentation strategy, which introduces semantic inconsistency into the learning process: the randomness may add disturbances to the original sample that reverse its identity. Moreover, the negative-sample demarcation strategy leaves the negative set containing samples semantically similar to the anchor, known as false negative samples. Consequently, CL's maximization and reduction process incorporates distractors into the learned feature representation. In this paper, we propose a novel Semantic Consistency Regularization (SCR) method to alleviate this problem. Specifically, we introduce a new regularization term, the pairwise subspace distance, to constrain the consistency of distributions across different views. Furthermore, we propose a divide-and-conquer strategy that keeps SCR well suited to large mini-batches. Empirically, results on multiple small- and large-scale benchmark datasets demonstrate that SCR outperforms state-of-the-art methods. Code is available at https://github.com/PaulGHJ/SCR.git.
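The abstract does not spell out the exact form of the pairwise subspace distance, so the sketch below is one plausible reading rather than the authors' implementation (see the linked repository for that): it penalizes the Frobenius distance between the within-batch pairwise-similarity (Gram) matrices of two augmented views, so both views induce a consistent batch geometry, and adds a hypothetical chunked variant in the spirit of the divide-and-conquer strategy for large mini-batches. The function names, the weighting coefficient lam, and the chunk size are illustrative assumptions.

    # Minimal PyTorch sketch of a semantic-consistency regularizer in the
    # spirit of SCR; an assumption-based reading, not the authors' code.
    import torch
    import torch.nn.functional as F

    def pairwise_subspace_distance(z1: torch.Tensor, z2: torch.Tensor) -> torch.Tensor:
        """z1, z2: (batch, dim) embeddings of two augmented views of one batch."""
        z1 = F.normalize(z1, dim=1)
        z2 = F.normalize(z2, dim=1)
        g1 = z1 @ z1.T  # (batch, batch) pairwise similarities, view 1
        g2 = z2 @ z2.T  # (batch, batch) pairwise similarities, view 2
        return (g1 - g2).pow(2).mean()  # Frobenius-style distance between Grams

    def chunked_regularizer(z1: torch.Tensor, z2: torch.Tensor, chunk: int = 64) -> torch.Tensor:
        """Hypothetical divide-and-conquer variant: average the regularizer over
        batch chunks so the Gram matrices stay small for large mini-batches."""
        losses = [pairwise_subspace_distance(a, b)
                  for a, b in zip(z1.split(chunk), z2.split(chunk))]
        return torch.stack(losses).mean()

    def info_nce(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.5) -> torch.Tensor:
        """Standard SimCLR-style InfoNCE between two views, for context."""
        z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
        logits = z1 @ z2.T / temperature  # positives lie on the diagonal
        labels = torch.arange(z1.size(0), device=z1.device)
        return F.cross_entropy(logits, labels)

    # Total loss: contrastive term plus the consistency regularizer,
    # weighted by a hypothetical coefficient lam.
    z1, z2 = torch.randn(256, 128), torch.randn(256, 128)
    lam = 1.0
    loss = info_nce(z1, z2) + lam * chunked_regularizer(z1, z2)

One design note on this reading: matching Gram matrices rather than individual embeddings constrains the distribution of each whole view, which is what would let such a regularizer damp identity-reversing augmentations and false negatives without needing labels.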
Pages: 9
Related Papers
50 items in total
  • [21] Federated Contrastive Learning for Personalized Semantic Communication
    Wang, Yining
    Ni, Wanli
    Yi, Wenqiang
    Xu, Xiaodong
    Zhang, Ping
    Nallanathan, Arumugam
    IEEE COMMUNICATIONS LETTERS, 2024, 28 (08) : 1875 - 1879
  • [22] Contrastive Learning-Based Semantic Communications
    Tang, Shunpu
    Yang, Qianqian
    Fan, Lisheng
    Lei, Xianfu
    Nallanathan, Arumugam
    Karagiannidis, George K.
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2024, 72 (10) : 6328 - 6343
  • [23] Contrastive Learning for Label Efficient Semantic Segmentation
    Zhao, Xiangyun
    Vemulapalli, Raviteja
    Mansfield, Philip Andrew
    Gong, Boqing
    Green, Bradley
    Shapira, Lior
    Wu, Ying
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 10603 - 10613
  • [24] Leveraging deep contrastive learning for semantic interaction
    Belcaid, Mahdi
    Martinez, Alberto Gonzalez
    Leigh, Jason
    PEERJ COMPUTER SCIENCE, 2022, 8
  • [26] Interventional Contrastive Learning with Meta Semantic Regularizer
    Qiang, Wenwen
    Li, Jiangmeng
    Zheng, Changwen
    Su, Bing
    Xiong, Hui
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [27] Guided contrastive boundary learning for semantic segmentation
    Qiu, Shoumeng
    Chen, Jie
    Zhang, Haiqiang
    Wan, Ru
    Xue, Xiangyang
    Pu, Jian
    PATTERN RECOGNITION, 2024, 155
  • [28] Correction to: Learning Contrastive Representation for Semantic Correspondence
    Xiao, Taihong
    Liu, Sifei
    De Mello, Shalini
    Yu, Zhiding
    Kautz, Jan
    Yang, Ming-Hsuan
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2022, 130 : 1607 - 1607
  • [29] Semantic consistency for graph representation learning
    Huang, Jincheng
    Li, Pin
    Zhang, Kai
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022
  • [30] Contrastive Cycle Consistency Learning for Unsupervised Visual Tracking
    Zhu, Jiajun
    Ma, Chao
    Jia, Shuai
    Xu, Shugong
    PATTERN RECOGNITION AND COMPUTER VISION, PT I, 2021, 13019 : 564 - 576