Self-Attention Message Passing for Contrastive Few-Shot Learning

Citations: 0
Authors:
Shirekar, Ojas Kishorkumar [1,2]
Singh, Anuj [1,2]
Jamali-Rad, Hadi [1,2]
Affiliations:
[1] Delft University of Technology, Delft, Netherlands
[2] Shell Global Solutions International BV, Amsterdam, Netherlands
DOI: 10.1109/WACV56688.2023.00539
CLC Classification: TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract:
Humans have a unique ability to learn new representations from just a handful of examples with little to no supervision. Deep learning models, however, require an abundance of data and supervision to perform at a satisfactory level. Unsupervised few-shot learning (U-FSL) is the pursuit of bridging this gap between machines and humans. Inspired by the capacity of graph neural networks (GNNs) to discover complex inter-sample relationships, we propose a novel self-attention based message passing contrastive learning approach (coined SAMP-CLR) for U-FSL pre-training. We also propose an optimal transport (OT) based fine-tuning strategy (which we call OpT-Tune) to efficiently induce task awareness into our novel end-to-end unsupervised few-shot classification framework (SAMPTransfer). Our extensive experimental results corroborate the efficacy of SAMPTransfer in a variety of downstream few-shot classification scenarios, setting a new state-of-the-art for U-FSL on both the miniImageNet and tieredImageNet benchmarks, with improvements of up to 7% and 5%, respectively. Our further investigations also confirm that SAMPTransfer remains on par with some supervised baselines on miniImageNet and outperforms all existing U-FSL baselines in a challenging cross-domain scenario. Our code can be found in our GitHub repository: https://github.com/ojss/SAMPTransfer/.
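To make the two ingredients named in the abstract concrete, the sketch below illustrates, in PyTorch, what a self-attention message-passing step over a fully connected graph of image embeddings and an entropy-regularized Sinkhorn routine for optimal-transport alignment typically look like. This is a minimal illustration under our own assumptions, not the authors' implementation: the names `SelfAttentionMessagePassing` and `sinkhorn` and all hyperparameters are hypothetical; the actual SAMPTransfer code is in the linked GitHub repository.

```python
# Minimal illustrative sketch (PyTorch), NOT the authors' implementation.
# (a) one self-attention message-passing step over a fully connected
#     graph of image embeddings, and
# (b) a log-domain Sinkhorn routine for entropy-regularized optimal
#     transport. All names and hyperparameters are assumptions.
import math
import torch
import torch.nn as nn


class SelfAttentionMessagePassing(nn.Module):
    """One message-passing step whose edge weights are scaled
    dot-product self-attention scores between sample embeddings."""

    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (n, dim) node features, one node per image in the task
        logits = (self.q(x) @ self.k(x).T) * self.scale  # (n, n) edge scores
        adj = logits.softmax(dim=-1)                     # soft adjacency matrix
        return x + adj @ self.v(x)                       # aggregate + residual


def sinkhorn(cost: torch.Tensor, eps: float = 0.1, n_iters: int = 50) -> torch.Tensor:
    """Entropy-regularized OT plan between uniform marginals,
    computed with log-domain Sinkhorn iterations for stability."""
    n, m = cost.shape
    log_K = -cost / eps                        # log of the Gibbs kernel
    log_a = torch.full((n,), -math.log(n))     # uniform source marginal
    log_b = torch.full((m,), -math.log(m))     # uniform target marginal
    log_u = torch.zeros(n)
    log_v = torch.zeros(m)
    for _ in range(n_iters):
        log_u = log_a - torch.logsumexp(log_K + log_v[None, :], dim=1)
        log_v = log_b - torch.logsumexp(log_K + log_u[:, None], dim=0)
    return torch.exp(log_u[:, None] + log_K + log_v[None, :])  # (n, m) plan


# Usage: refine embeddings of a hypothetical 5-way task, then align
# the 5 support embeddings to the 20 query embeddings.
feats = torch.randn(25, 128)                       # 25 image embeddings
refined = SelfAttentionMessagePassing(128)(feats)  # one message-passing step
plan = sinkhorn(torch.cdist(refined[:5], refined[5:]))  # (5, 20) OT plan
```

The log-domain formulation is a common choice because the plain Sinkhorn recursion multiplies exponentials of -cost/eps, which underflows quickly for small regularization values.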
Pages: 5415-5425 (11 pages)
Related Papers (50 in total; items [21]-[30] shown below)
  • [21] Supervised Contrastive Learning for Few-Shot Action Classification
    Han, Hongfeng
    Fei, Nanyi
    Lu, Zhiwu
    Wen, Ji-Rong
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2022, PT III, 2023, 13715 : 512 - 528
  • [22] Self-Supervised and Few-Shot Contrastive Learning Frameworks for Text Clustering
    Shi, Haoxiang
    Sakai, Tetsuya
    IEEE ACCESS, 2023, 11 : 84134 - 84143
  • [23] Multimodal variational contrastive learning for few-shot classification
    Pan, Meihong
    Shen, Hongbin
    APPLIED INTELLIGENCE, 2024, 54 : 1879 - 1892
  • [24] Few-shot Object Detection with Refined Contrastive Learning
    Shangguan, Zeyu
    Huai, Lian
    Liu, Tong
    Jiang, Xingqun
    2023 IEEE 35TH INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE, ICTAI, 2023, : 991 - 996
  • [25] Convolutional Self-attention Guided Graph Neural Network for Few-Shot Action Recognition
    Pan, Fei
    Guo, Jie
    Guo, Yanwen
    ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, ICIC 2023, PT II, 2023, 14087 : 401 - 412
  • [26] Spatial Attention Network for Few-Shot Learning
    He, Xianhao
    Qiao, Peng
    Dou, Yong
    Niu, Xin
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: DEEP LEARNING, PT II, 2019, 11728 : 567 - 578
  • [27] Reinforced Attention for Few-Shot Learning and Beyond
    Hong, Jie
    Fang, Pengfei
    Li, Weihao
    Zhang, Tong
    Simon, Christian
    Harandi, Mehrtash
    Petersson, Lars
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 913 - 923
  • [28] Attention Relational Network for Few-Shot Learning
    Shuai, Jia
    Chen, JiaMing
    Yang, Meng
    INTELLIGENCE SCIENCE AND BIG DATA ENGINEERING: BIG DATA AND MACHINE LEARNING, PT II, 2019, 11936 : 163 - 174
  • [29] A Contrastive Self-Supervised Learning Scheme for Beat Tracking Amenable to Few-Shot Learning
    Gagnere, Antonin
    Essid, Slim
    Peeters, Geoffroy
    arXiv preprint
  • [30] Unsupervised prototype self-calibration based on hybrid attention contrastive learning for enhanced few-shot action recognition
    An, Yiyuan
    Yi, Yingmin
    Wu, Li
    Cao, Yuan
    Zhou, Dingsong
    Yuan, Yiwei
    Liu, Bojun
    Xue, Xianghong
    Li, Yankai
    Su, Chunyi
    APPLIED SOFT COMPUTING, 2025, 168