Self-Attention Message Passing for Contrastive Few-Shot Learning

Cited by: 0
Authors
Shirekar, Ojas Kishorkumar [1, 2]
Singh, Anuj [1, 2]
Jamali-Rad, Hadi [1, 2]
Affiliations
[1] Delft Univ Technol, Delft, Netherlands
[2] Shell Global Solut Int BV, Amsterdam, Netherlands
Keywords
DOI
10.1109/WACV56688.2023.00539
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Humans have a unique ability to learn new representations from just a handful of examples with little to no supervision. Deep learning models, however, require an abundance of data and supervision to perform at a satisfactory level. Unsupervised few-shot learning (U-FSL) is the pursuit of bridging this gap between machines and humans. Inspired by the capacity of graph neural networks (GNNs) in discovering complex inter-sample relationships, we propose a novel self-attention based message passing contrastive learning approach (coined as SAMP-CLR) for U-FSL pre-training. We also propose an optimal transport (OT) based fine-tuning strategy (we call OpT-Tune) to efficiently induce task awareness into our novel end-to-end unsupervised few-shot classification framework (SAMPTransfer). Our extensive experimental results corroborate the efficacy of SAMPTransfer in a variety of downstream few-shot classification scenarios, setting a new state-of-the-art for U-FSL on both miniImageNet and tieredImageNet benchmarks, offering up to 7%+ and 5%+ improvements, respectively. Our further investigations also confirm that SAMPTransfer remains on par with some supervised baselines on miniImageNet and outperforms all existing U-FSL baselines in a challenging cross-domain scenario. Our code can be found in our GitHub repository: https://github.com/ojss/SAMPTransfer/.
Pages: 5415-5425
Page count: 11
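
The abstract describes two ideas: a self-attention based message-passing step over sample embeddings (SAMP-CLR) and an optimal-transport based alignment used to make fine-tuning task-aware (OpT-Tune). The sketch below is a rough, hypothetical illustration of these two mechanisms only, not the authors' implementation; the module name SelfAttentionMessagePassing, the sinkhorn_alignment helper, and all shapes and hyperparameters are assumptions made for this example. For the actual code, see the GitHub repository linked in the abstract.

import torch
import torch.nn as nn


class SelfAttentionMessagePassing(nn.Module):
    # Hypothetical module: treat each embedded sample as a node of a fully
    # connected graph and let multi-head self-attention act as one round of
    # message passing (attention weights play the role of soft edges).
    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):
        # x: (batch, num_samples, dim) features from a CNN backbone
        messages, _ = self.attn(x, x, x)   # aggregate messages from all nodes
        return self.norm(x + messages)     # residual node update


def sinkhorn_alignment(queries, prototypes, eps=0.1, iters=50):
    # Hypothetical stand-in for an OT-based task-alignment step: an
    # entropy-regularised transport plan between query embeddings and class
    # prototypes, computed with log-domain Sinkhorn iterations.
    cost = torch.cdist(queries, prototypes) ** 2        # (n_query, n_way)
    log_k = -cost / eps
    u = torch.zeros(cost.size(0))
    v = torch.zeros(cost.size(1))
    for _ in range(iters):
        u = -torch.logsumexp(log_k + v[None, :], dim=1)
        v = -torch.logsumexp(log_k + u[:, None], dim=0)
    return torch.exp(log_k + u[:, None] + v[None, :])   # transport plan


if __name__ == "__main__":
    feats = torch.randn(1, 25, 64)                  # 25 samples, 64-d embeddings
    refined = SelfAttentionMessagePassing(64)(feats)
    plan = sinkhorn_alignment(refined[0, :20], refined[0, 20:])
    print(refined.shape, plan.shape)                # (1, 25, 64) and (20, 5)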