Asymmetric patch sampling for contrastive learning

Cited by: 1
|
Authors
Shen, Chengchao [1 ]
Chen, Jianzhong [1 ]
Wang, Shu [1 ]
Kuang, Hulin [1 ]
Liu, Jin [1 ]
Wang, Jianxin [1 ]
Affiliations
[1] Cent South Univ, 932 Lushan South Rd, Changsha 410083, Hunan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Contrastive learning; Unsupervised learning; Self-supervised learning; Representation learning; Deep learning;
DOI
10.1016/j.patcog.2024.111012
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Asymmetric appearance between the two views of a positive pair effectively reduces the risk of representation degradation in contrastive learning. However, positive pairs constructed by existing methods still share substantial appearance similarity, which inhibits further representation improvement. To address this issue, we propose a novel asymmetric patch sampling strategy that significantly reduces appearance similarity while retaining image semantics. Specifically, two patch sampling strategies are applied to the given image. First, sparse patch sampling is conducted to obtain the first view, which reduces the spatial redundancy of the image and allows a more asymmetric view. Second, selective patch sampling is proposed to construct the second view with a large appearance discrepancy relative to the first. Because the appearance similarity between the positive pair is negligible, the trained model is encouraged to capture similarities in semantics instead of low-level ones. Experimental results demonstrate that our method significantly outperforms existing self-supervised learning methods on the ImageNet-1K and CIFAR datasets, e.g., a 2.5% fine-tuning accuracy improvement on CIFAR100. Furthermore, our method achieves state-of-the-art performance on the downstream tasks of object detection and instance segmentation on COCO. Additionally, compared to other self-supervised methods, our method is more efficient in both memory and computation during pretraining.
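The dual sampling described in the abstract (sparse sampling for the first view, selective sampling for a second view with large appearance discrepancy) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the patch size, sampling ratio, and the complement-based selection rule for the second view are assumptions made for the sketch.

```python
import numpy as np

def asymmetric_patch_sample(image, patch=16, sparse_ratio=0.25, rng=None):
    """Sample two asymmetric patch views from one image (H, W, C).

    View 1: a sparse random subset of non-overlapping patches, which
    reduces spatial redundancy.
    View 2: patches drawn from the complement of view 1's indices, so the
    two views share almost no appearance (a hypothetical selection rule
    standing in for the paper's selective sampling).
    """
    rng = rng or np.random.default_rng()
    H, W, C = image.shape
    gh, gw = H // patch, W // patch
    n = gh * gw
    # Cut the image into a flat array of n non-overlapping patches.
    patches = (image[:gh * patch, :gw * patch]
               .reshape(gh, patch, gw, patch, C)
               .transpose(0, 2, 1, 3, 4)
               .reshape(n, patch, patch, C))
    # View 1: sparse sampling, keep only sparse_ratio of the patches.
    k = max(1, int(n * sparse_ratio))
    idx1 = rng.choice(n, size=k, replace=False)
    # View 2: sample from the complement to maximise appearance discrepancy.
    rest = np.setdiff1d(np.arange(n), idx1)
    idx2 = rng.choice(rest, size=min(k, len(rest)), replace=False)
    return patches[idx1], patches[idx2], idx1, idx2
```

Since the two index sets are disjoint, the two views of the positive pair contain no common pixels, leaving only semantic content as the shared signal for the contrastive objective.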
Pages: 10
Related papers
50 records
  • [21] Data efficient contrastive learning in histopathology using active sampling
    Reasat, Tahsin
    Sushmit, Asif
    Smith, David S.
    MACHINE LEARNING WITH APPLICATIONS, 2024, 17
  • [22] Contrastive Learning and Multi-Choice Negative Sampling Recommendation
    Xue, Yun
    Cai, Xiaodong
    Fang, Sheng
    Zhou, Li
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2024, 15 (05) : 905 - 912
  • [23] Contrastive Learning in Neural Tensor Networks using Asymmetric Examples
    Islam, Mohammad Maminur
    Sarkhel, Somdeb
    Venugopal, Deepak
    2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2021, : 28 - 39
  • [24] Contrastive Speaker Representation Learning with Hard Negative Sampling for Speaker Recognition
    Go, Changhwan
    Lee, Young Han
    Kim, Taewoo
    Park, Nam In
    Chun, Chanjun
    SENSORS, 2024, 24 (19)
  • [25] Hybrid sampling-based contrastive learning for imbalanced node classification
    Cui, Caixia
    Wang, Jie
    Wei, Wei
    Liang, Jiye
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14 (03) : 989 - 1001
  • [27] PSSL: Self-supervised Learning for Personalized Search with Contrastive Sampling
    Zhou, Yujia
    Dou, Zhicheng
    Zhu, Yutao
    Wen, Ji-Rong
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 2749 - 2758
  • [28] Relation-dependent contrastive learning with cluster sampling for inductive relation prediction
    Wu, Jianfeng
    Xiong, Aolin
    Mai, Sijie
    Hu, Haifeng
    NEUROCOMPUTING, 2024, 579
  • [29] Prediction of Graduation Development Based on Hypergraph Contrastive Learning With Imbalanced Sampling
    Ouyang, Yong
    Feng, Tuo
    Gao, Rong
    Xu, Yubin
    Liu, Jinghang
    IEEE ACCESS, 2023, 11 : 89881 - 89895
  • [30] EMCRL: EM-Enhanced Negative Sampling Strategy for Contrastive Representation Learning
    Zhang, Kun
    Lv, Guangyi
    Wu, Le
    Hong, Richang
    Wang, Meng
    IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS, 2024,