Asymmetric patch sampling for contrastive learning

Cited by: 1
Authors
Shen, Chengchao [1 ]
Chen, Jianzhong [1 ]
Wang, Shu [1 ]
Kuang, Hulin [1 ]
Liu, Jin [1 ]
Wang, Jianxin [1 ]
Affiliations
[1] Cent South Univ, 932 Lushan South Rd, Changsha 410083, Hunan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Contrastive learning; Unsupervised learning; Self-supervised learning; Representation learning; Deep learning;
DOI
10.1016/j.patcog.2024.111012
CLC classification number
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Asymmetric appearance between the two views of a positive pair effectively reduces the risk of representation degradation in contrastive learning. However, positive pairs constructed by existing methods still share many appearance similarities, which inhibits further representation improvement. To address this issue, we propose a novel asymmetric patch sampling strategy that significantly reduces appearance similarity while retaining image semantics. Specifically, two different patch sampling strategies are applied to the given image. First, sparse patch sampling is conducted to obtain the first view, which reduces the spatial redundancy of the image and allows a more asymmetric view. Second, selective patch sampling is proposed to construct another view with a large appearance discrepancy relative to the first. Because the positive pair shares negligible appearance similarity, the trained model is encouraged to capture semantic similarities rather than low-level ones. Experimental results demonstrate that our method significantly outperforms existing self-supervised learning methods on the ImageNet-1K and CIFAR datasets, e.g., a 2.5% fine-tuning accuracy improvement on CIFAR100. Furthermore, our method achieves state-of-the-art performance on the downstream tasks of object detection and instance segmentation on COCO. Additionally, compared to other self-supervised methods, our method is more efficient in both memory and computation during pretraining.
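The two-view construction described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the paper's selective sampling chooses the second view by an appearance-discrepancy criterion, whereas the simplification below enforces discrepancy by drawing the second view only from patches excluded from the first (disjoint patch sets). All function names and the `sparse_ratio` parameter are hypothetical.

```python
import numpy as np

def split_into_patches(image, patch=16):
    # (H, W, C) image -> (N, patch, patch, C) array of non-overlapping patches
    H, W, C = image.shape
    gh, gw = H // patch, W // patch
    grid = image[:gh * patch, :gw * patch].reshape(gh, patch, gw, patch, C)
    return grid.transpose(0, 2, 1, 3, 4).reshape(gh * gw, patch, patch, C)

def asymmetric_patch_sampling(image, patch=16, sparse_ratio=0.25, rng=None):
    """Build two patch-based views of one image.

    View 1: a sparse random subset of patches (low spatial redundancy).
    View 2: drawn only from the remaining patches, so the views share no
    patch at all -- a crude stand-in for the paper's selective sampling,
    which maximizes appearance discrepancy while keeping the semantics.
    """
    rng = np.random.default_rng(rng)
    patches = split_into_patches(image, patch)
    n = len(patches)
    k1 = max(1, int(n * sparse_ratio))              # patches in the sparse view
    k2 = max(1, int((n - k1) * sparse_ratio))       # patches in the second view
    perm = rng.permutation(n)
    idx1, idx2 = perm[:k1], perm[k1:k1 + k2]        # disjoint index sets
    return patches[idx1], idx1, patches[idx2], idx2
```

In a contrastive pipeline, the two returned patch sets (plus their grid indices, for positional embeddings) would be fed to the two branches as a positive pair; because they overlap in no pixel, any agreement the model learns must come from shared semantics rather than shared low-level appearance.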
Pages: 10
Related papers
50 records
  • [41] Asymmetric slack contrastive learning for full use of feature information in image translation
    Zhang, Yusen
    Li, Min
    Gou, Yao
    He, Yujie
    KNOWLEDGE-BASED SYSTEMS, 2024, 299
  • [42] Unsupervised Deraining: Where Asymmetric Contrastive Learning Meets Self-Similarity
    Chang, Yi
    Guo, Yun
    Ye, Yuntong
    Yu, Changfeng
    Zhu, Lin
    Zhao, Xile
    Yan, Luxin
    Tian, Yonghong
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (05) : 2638 - 2657
  • [43] Self-Contrastive Learning with Hard Negative Sampling for Self-supervised Point Cloud Learning
    Du, Bi'an
    Gao, Xiang
    Hu, Wei
    Li, Xin
    PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2021, 2021, : 3133 - 3142
  • [44] Patch-Mixing Contrastive Regularization for Few-Label Semi-Supervised Learning
    Hu X.
    Xu X.
    Zeng Y.
    Yang X.
    IEEE Transactions on Artificial Intelligence, 2024, 5 (01): : 384 - 397
  • [45] Revisiting Multimodal Representation in Contrastive Learning: From Patch and Token Embeddings to Finite Discrete Tokens
    Chen, Yuxiao
    Yuan, Jianbo
    Tian, Yu
    Geng, Shijie
    Li, Xinyu
    Zhou, Ding
    Metaxas, Dimitris N.
    Yang, Hongxia
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 15095 - 15104
  • [46] Contrastive Dual Gating: Learning Sparse Features With Contrastive Learning
    Meng, Jian
    Yang, Li
    Shin, Jinwoo
    Fan, Deliang
    Seo, Jae-Sun
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 12247 - 12255
  • [47] Self-Supervised Learning With Learnable Sparse Contrastive Sampling for Hyperspectral Image Classification
    Liang, Miaomiao
    Dong, Jian
    Yu, Lingjuan
    Yu, Xiangchun
    Meng, Zhe
    Jiao, Licheng
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2023, 61 : 1 - 13
  • [48] Domain generalization by class-aware negative sampling-based contrastive learning
    Xie, Mengwei
    Zhao, Suyun
    Chen, Hong
    Li, Cuiping
    AI OPEN, 2022, 3 : 200 - 207
  • [49] Improving Contrastive Learning on Imbalanced Seed Data via Open-World Sampling
    Jiang, Ziyu
    Chen, Tianlong
    Chen, Ting
    Wang, Zhangyang
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [50] Improved Patch-Mix Transformer and Contrastive Learning Method for Sound Classification in Noisy Environments
    Chen, Xu
    Wang, Mei
    Kan, Ruixiang
    Qiu, Hongbing
    APPLIED SCIENCES-BASEL, 2024, 14 (21):