Few-shot learning based on prototype rectification with a self-attention mechanism

Cited by: 2
Authors
Zhao, Peng [1 ]
Wang, Liang
Zhao, Xuyang
Liu, Huiting
Ji, Xia
Affiliations
[1] Anhui Univ, Key Lab Intelligent Comp & Signal Proc, Minist Educ, Hefei 230601, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Few-shot learning; Metric learning; Meta-learning; Self-attention mechanism;
DOI
10.1016/j.eswa.2024.123586
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Few-shot learning (FSL) is a challenging problem. Prototype-based methods are simple and effective approaches to few-shot learning. Due to the lack of labeled samples, the class prototypes learned by existing prototype-based few-shot learning methods deviate greatly from the true class centers and cannot adequately express the representative and discriminative characteristics of their corresponding classes. To address this problem, we propose few-shot learning based on prototype rectification with a self-attention mechanism (FSL-PRS). To learn more unbiased and discriminative class prototypes, FSL-PRS treats the support set and the query set as a whole and uses a self-attention mechanism to learn task-related features from the features extracted by a pretrained backbone network. The learned task-related features are then used to compute the original class prototypes and to predict a pseudo label and a confidence score for each query sample. Query samples with high confidence are incorporated into the support set to rectify the class prototypes. Because the learned class prototypes should highlight class significance, a class significance learning module is designed to make them more discriminative. Unlike prior works, we treat the support set and the query set as a whole when learning task-related features with a self-attention mechanism, which not only alleviates the negative effect of distribution differences between the support set and the query set but also fuses global context information to enhance features for FSL. We conduct comprehensive experiments on four benchmark datasets widely adopted in few-shot learning. The experimental results demonstrate that FSL-PRS achieves state-of-the-art performance, which validates its effectiveness.
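The abstract describes a pipeline of self-attention over the whole task, pseudo-labeling of queries, and confidence-based prototype rectification. The following is a minimal sketch of that pipeline (not the authors' code), assuming pre-extracted backbone features; the function name `rectify_prototypes`, the cosine scoring with a temperature of 10, the 0.8 confidence threshold, and the untrained torch.nn.MultiheadAttention layer are illustrative assumptions, and the class significance learning module is omitted.

```python
# Sketch of confidence-based prototype rectification with task-level self-attention.
# Assumptions (not from the paper): cosine similarity, temperature 10, threshold 0.8,
# and an untrained attention layer; in FSL-PRS the attention module would be meta-trained.
import torch
import torch.nn.functional as F

def rectify_prototypes(support_feat, support_lbl, query_feat,
                       num_classes, conf_threshold=0.8, num_heads=4):
    """support_feat: (N*K, d), support_lbl: (N*K,) int labels, query_feat: (Q, d)."""
    d = support_feat.size(-1)  # d must be divisible by num_heads
    attn = torch.nn.MultiheadAttention(embed_dim=d, num_heads=num_heads, batch_first=True)

    # Treat the support and query samples as one task-level sequence so that
    # self-attention fuses global context into every feature.
    task = torch.cat([support_feat, query_feat], dim=0).unsqueeze(0)  # (1, N*K+Q, d)
    task, _ = attn(task, task, task)
    task = task.squeeze(0)
    s_feat, q_feat = task[:support_feat.size(0)], task[support_feat.size(0):]

    # Original prototypes: per-class mean of the attended support features.
    protos = torch.stack([s_feat[support_lbl == c].mean(0) for c in range(num_classes)])

    # Pseudo-label each query by cosine similarity to the prototypes; the softmax
    # over similarities serves as a confidence score.
    sims = F.normalize(q_feat, dim=-1) @ F.normalize(protos, dim=-1).T  # (Q, num_classes)
    conf, pseudo = F.softmax(sims * 10.0, dim=-1).max(dim=-1)

    # Rectification: fold high-confidence queries back into the support set and
    # recompute the class means.
    keep = conf >= conf_threshold
    aug_feat = torch.cat([s_feat, q_feat[keep]], dim=0)
    aug_lbl = torch.cat([support_lbl, pseudo[keep]], dim=0)
    rectified = torch.stack([aug_feat[aug_lbl == c].mean(0) for c in range(num_classes)])
    return rectified, pseudo, conf
```

For a 5-way 1-shot task with, say, 64-dimensional features, `rectify_prototypes(s, y, q, num_classes=5)` returns rectified prototypes plus a pseudo label and confidence per query; classification then reduces to nearest-prototype assignment against the rectified prototypes.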
Pages: 11
Related Papers
50 records in total
  • [41] Zhong, Shanna; Wang, Jiahui; Yue, Kun; Duan, Liang; Sun, Zhengbao; Fang, Yan. Few-Shot Relation Prediction of Knowledge Graph via Convolutional Neural Network with Self-Attention. Data Science and Engineering, 2023, 8(4): 385-395.
  • [42] Tian, Yang; Meng, Hao; Yuan, Fei; Ling, Yue; Yuan, Ningze. Vision Transformer With Enhanced Self-Attention for Few-Shot Ship Target Recognition in Complex Environments. IEEE Transactions on Instrumentation and Measurement, 2023, 72.
  • [43] Liao, Ruizhi; Zhai, Junhai; Zhang, Feng. Optimization model based on attention mechanism for few-shot image classification. Machine Vision and Applications, 2024, 35(2).
  • [45] Tang, Yuhong; Li, Guang; Zhang, Ming; Li, Jianjun. Few-Shot Learning Based on Dimensionally Enhanced Attention and Logit Standardization Self-Distillation. Electronics, 2024, 13(15).
  • [46] Takimoto, Hironori; Seki, Junya; Situju, Sulfayanti F.; Kanagawa, Akihiro. Anomaly Detection Using Siamese Network with Attention Mechanism for Few-Shot Learning. Applied Artificial Intelligence, 2022, 36(1).
  • [47] Xu, Qiuyu; Su, Jie; Wang, Ying; Zhang, Jing; Zhong, Yixin. Few-Shot Learning Based on Double Pooling Squeeze and Excitation Attention. Electronics, 2023, 12(1).
  • [48] Yang, Shun; Du, YaJun; Zheng, Xin; Li, XianYong; Chen, XiaoLiang; Li, YanLi; Xie, ChunZhi. Few-shot intent detection with self-supervised pretraining and prototype-aware attention. Pattern Recognition, 2024, 155.
  • [49] Hou, Zejiang; Walid, Anwar; Kung, Sun-Yuan. Meta-Learning with Attention for Improved Few-Shot Learning. 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2021), 2021: 2725-2729.
  • [50] Sun, Li; Wang, Liuan; Sun, Jun; Okatani, Takayuki. Prompt Prototype Learning Based on Ranking Instruction for Few-Shot Visual Tasks. 2023 IEEE International Conference on Image Processing (ICIP), 2023: 3235-3239.