SELF-ATTENTION RELATION NETWORK FOR FEW-SHOT LEARNING

Cited by: 34
Authors
Hui, Binyuan [1 ]
Zhu, Pengfei [1 ]
Hu, Qinghua [1 ]
Wang, Qilong [1 ]
Affiliations
[1] Tianjin Univ, Coll Intelligence & Comp, Tianjin, Peoples R China
Keywords
Few-shot Learning; Zero-shot Learning; Self-Attention Module;
DOI
10.1109/ICMEW.2019.00041
CLC Classification
TP3 [Computing Technology, Computer Technology]
Subject Classification
0812
Abstract
The success of deep learning is largely attributable to massive amounts of accurately labeled data. For few-shot learning, and especially zero-shot learning, deep models cannot be trained well because few labeled samples are available. Inspired by the human visual system, attention models have been widely used in action recognition, instance segmentation, and other vision tasks by introducing spatial, temporal, or channel-wise weights. In this paper, we propose a self-attention relation network (SARN) for few-shot learning. SARN consists of three modules: an embedding module, an attention module, and a relation module. The embedding module extracts feature maps, while the attention module enhances the learned features. Finally, the extracted features of the query sample and the support set are fed into the relation module for comparison, which outputs a relation score for classification. Compared with the existing relation network for few-shot learning, SARN can discover non-local information and model long-range dependencies. SARN can easily be extended to zero-shot learning by replacing the support set with semantic vectors. Experiments on benchmarks (Omniglot, miniImageNet, AwA, and CUB) show that the proposed SARN outperforms state-of-the-art algorithms on both few-shot and zero-shot tasks.
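The pipeline the abstract describes (embed features, enhance them with self-attention over all spatial positions, then score query–support pairs with a relation module) can be sketched in NumPy. All weight matrices, shapes, and the tiny MLP below are illustrative placeholders, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(feat, w_theta, w_phi, w_g):
    # feat: (C, N) flattened feature map, N = H*W spatial positions.
    # Pairwise affinities between every position let the output depend on
    # non-local information, i.e. long-range dependencies across the map.
    theta = w_theta @ feat                   # (C', N) query projection
    phi   = w_phi @ feat                     # (C', N) key projection
    g     = w_g @ feat                       # (C, N)  value projection
    attn  = softmax(theta.T @ phi, axis=-1)  # (N, N) position-to-position weights
    out   = g @ attn.T                       # aggregate values over all positions
    return feat + out                        # residual connection

def relation_score(q, s, w1, w2):
    # Concatenate query and support embeddings and score the pair with a
    # small MLP; sigmoid keeps the relation score in (0, 1).
    h = np.maximum(w1 @ np.concatenate([q, s]), 0.0)  # ReLU hidden layer
    return 1.0 / (1.0 + np.exp(-(w2 @ h)))

# Illustrative sizes: 8 channels, 4 projected channels, 16 positions, 32 hidden units.
C, Cp, N, H = 8, 4, 16, 32
w_theta = rng.normal(size=(Cp, C))
w_phi   = rng.normal(size=(Cp, C))
w_g     = rng.normal(size=(C, C))

feat = rng.normal(size=(C, N))               # stand-in for an embedded feature map
att  = self_attention(feat, w_theta, w_phi, w_g)

q = att.mean(axis=1)                         # pooled query embedding
s = rng.normal(size=C)                       # stand-in for a support-class embedding
w1 = rng.normal(size=(H, 2 * C)) * 0.1
w2 = rng.normal(size=H) * 0.1
score = relation_score(q, s, w1, w2)         # relation score for classification
```

At test time one would compute such a score between the query and each support class (or, in the zero-shot setting, each class semantic vector) and predict the class with the highest score.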
Pages: 198-203
Page count: 6
Related Papers
50 records
  • [1] SaberNet: Self-attention based effective relation network for few-shot learning
    Li, Zijun
    Hu, Zhengping
    Luo, Weiwei
    Hu, Xiao
    [J]. PATTERN RECOGNITION, 2023, 133
  • [2] SAPENet: Self-Attention based Prototype Enhancement Network for Few-shot Learning
    Huang, Xilang
    Choi, Seon Han
    [J]. PATTERN RECOGNITION, 2023, 135
  • [3] Self-Attention Message Passing for Contrastive Few-Shot Learning
    Shirekar, Ojas Kishorkumar
    Singh, Anuj
    Jamali-Rad, Hadi
    [J]. 2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 5415 - 5425
  • [4] Self-attention network for few-shot learning based on nearest-neighbor algorithm
    Wang, Guangpeng
    Wang, Yongxiong
    [J]. MACHINE VISION AND APPLICATIONS, 2023, 34 (02)
  • [5] Few-Shot Relation Prediction of Knowledge Graph via Convolutional Neural Network with Self-Attention
    Zhong, Shanna
    Wang, Jiahui
    Yue, Kun
    Duan, Liang
    Sun, Zhengbao
    Fang, Yan
    [J]. DATA SCIENCE AND ENGINEERING, 2023, 8 (04) : 385 - 395
  • [6] Few-shot learning based on prototype rectification with a self-attention mechanism
    Zhao, Peng
    Wang, Liang
    Zhao, Xuyang
    Liu, Huiting
    Ji, Xia
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2024, 249
  • [7] Few-Shot Learning Based on Self-Attention and Auto-Encoder
    Ji, Zhong
    Chai, Xingliang
    [J]. Tianjin Daxue Xuebao (Ziran Kexue yu Gongcheng Jishu Ban)/Journal of Tianjin University Science and Technology, 2021, 54 (04): 338 - 345
  • [8] Correction to: Self-attention network for few-shot learning based on nearest-neighbor algorithm
    Wang, Guangpeng
    Wang, Yongxiong
    [J]. Machine Vision and Applications, 2023, 34
  • [9] Spatial Attention Network for Few-Shot Learning
    He, Xianhao
    Qiao, Peng
    Dou, Yong
    Niu, Xin
    [J]. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: DEEP LEARNING, PT II, 2019, 11728 : 567 - 578