Self-supervised pairwise-sample resistance model for few-shot classification

Cited by: 0
Authors
Weigang Li
Lu Xie
Ping Gan
Yuntao Zhao
Affiliations
[1] Wuhan University of Science and Technology, Engineering Research Center of Metallurgical Automation and Measurement Technology of Ministry of Education
[2] Wuhan University of Science and Technology, College of Information Science and Engineering
[3] Qunar
Source
Applied Intelligence | 2023 / Volume 53
Keywords
Few-shot; Self-supervised; Meta-learning; Pairwise-sample; Regularization
DOI
Not available
Abstract
Traditional supervised learning models rely heavily on high-quality labeled samples; in many fields, training on only a limited number of labeled samples leads to poor generalization. To address this problem, we propose a novel few-shot image classification method based on self-supervised learning and metric learning, which consists of two training steps: (1) training a feature extractor and projection head with strong representational ability via self-supervised learning; (2) using the trained feature extractor and projection head to initialize the meta-learning model, and fine-tuning the meta-learning model with the proposed loss functions. Specifically, we construct a pairwise-sample meta loss (ML) that accounts for the influence of every sample on the target sample in the feature space, and propose a novel regularization technique, named resistance regularization, based on pairwise samples, which serves as an auxiliary loss for the meta-learning model. The model is evaluated on the 5-way 1-shot and 5-way 5-shot classification tasks of mini-ImageNet and tiered-ImageNet. The results demonstrate that the proposed method achieves state-of-the-art performance.
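The abstract only outlines the two-stage procedure, so the following is a minimal PyTorch-style sketch of how such a pipeline could be wired together: a self-supervised pre-training loss for the feature extractor and projection head, followed by episodic fine-tuning with a pairwise-sample loss plus an auxiliary regularizer. The concrete formulations of the pairwise-sample meta loss (ML) and the resistance regularization are not given in this record, so the functions contrastive_pretrain_loss, pairwise_meta_loss, and resistance_regularization below (and the toy Encoder) are illustrative placeholders, not the authors' definitions.

# Minimal sketch of the two-stage pipeline described in the abstract.
# Loss formulations are illustrative placeholders, not the paper's exact method.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Toy feature extractor + projection head (stand-in for the paper's backbone)."""
    def __init__(self, in_dim=3 * 32 * 32, feat_dim=128, proj_dim=64):
        super().__init__()
        self.backbone = nn.Sequential(nn.Flatten(), nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.projection = nn.Linear(feat_dim, proj_dim)

    def forward(self, x):
        return self.projection(self.backbone(x))

def contrastive_pretrain_loss(z1, z2, temperature=0.5):
    """Stage 1 (assumed SimCLR-style): pull two augmented views of each image together."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / temperature          # view-to-view similarities
    targets = torch.arange(z1.size(0))          # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

def pairwise_meta_loss(support, support_y, query, query_y, temperature=0.1):
    """Stage 2 (illustrative): weight every support sample's influence on each
    query by feature-space similarity, then classify via the aggregated weights."""
    sims = F.softmax(
        F.normalize(query, dim=1) @ F.normalize(support, dim=1).t() / temperature, dim=1)
    n_way = int(support_y.max().item()) + 1
    class_probs = sims @ F.one_hot(support_y, n_way).float()  # per-class pairwise weights
    return F.nll_loss(torch.log(class_probs + 1e-8), query_y)

def resistance_regularization(embeddings, margin=0.5):
    """Illustrative auxiliary term: penalize embedding pairs that collapse closer
    than a margin, 'resisting' trivial solutions (placeholder formulation)."""
    dists = torch.cdist(embeddings, embeddings)
    off_diag = dists + torch.eye(len(embeddings)) * 1e6   # ignore self-distances
    return F.relu(margin - off_diag).mean()

if __name__ == "__main__":
    enc = Encoder()
    # Stage 1: self-supervised pre-training on two augmented views of unlabeled data.
    view1, view2 = torch.randn(16, 3, 32, 32), torch.randn(16, 3, 32, 32)
    loss_ssl = contrastive_pretrain_loss(enc(view1), enc(view2))
    # Stage 2: episodic fine-tuning on one simulated 5-way 1-shot task.
    support_x, support_y = torch.randn(5, 3, 32, 32), torch.arange(5)
    query_x, query_y = torch.randn(10, 3, 32, 32), torch.arange(5).repeat(2)
    zs, zq = enc(support_x), enc(query_x)
    loss_total = pairwise_meta_loss(zs, support_y, zq, query_y) \
                 + 0.1 * resistance_regularization(torch.cat([zs, zq]))
    print(float(loss_ssl), float(loss_total))

In an actual experiment the toy Encoder would be replaced by the paper's backbone, and the fine-tuning stage would iterate over many sampled 5-way episodes rather than the single random task shown here.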
Pages: 20661-20674
Page count: 13
Related papers
50 records in total
  • [21] Few-shot image classification with composite rotation based self-supervised auxiliary task
    Mazumder, Pratik
    Singh, Pravendra
    Namboodiri, Vinay P.
    [J]. NEUROCOMPUTING, 2022, 489 : 179 - 195
  • [22] Self-Supervised Spectral-Spatial Graph Prototypical Network for Few-Shot Hyperspectral Image Classification
    Ma, Shan
    Tong, Lei
    Zhou, Jun
    Yu, Jing
    Xiao, Chuangbai
    [J]. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2023, 61
  • [23] Collaborative Self-Supervised Transductive Few-Shot Learning for Remote Sensing Scene Classification
    Han, Haiyan
    Huang, Yangchao
    Wang, Zhe
    [J]. ELECTRONICS, 2023, 12 (18)
  • [24] Self-Supervised Meta-Learning for Few-Shot Natural Language Classification Tasks
    Bansal, Trapit
    Jha, Rishikesh
    Munkhdalai, Tsendsuren
    McCallum, Andrew
    [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 522 - 534
  • [25] Multiform Ensemble Self-Supervised Learning for Few-Shot Remote Sensing Scene Classification
    Li, Jianzhao
    Gong, Maoguo
    Liu, Huilin
    Zhang, Yourun
    Zhang, Mingyang
    Wu, Yue
    [J]. IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2023, 61
  • [26] Self-supervised Prototype Conditional Few-Shot Object Detection
    Kobayashi, Daisuke
    [J]. IMAGE ANALYSIS AND PROCESSING, ICIAP 2022, PT II, 2022, 13232 : 681 - 692
  • [27] Self-Supervised Learning for Few-Shot Medical Image Segmentation
    Ouyang, Cheng
    Biffi, Carlo
    Chen, Chen
    Kart, Turkay
    Qiu, Huaqi
    Rueckert, Daniel
    [J]. IEEE TRANSACTIONS ON MEDICAL IMAGING, 2022, 41 (07) : 1837 - 1848
  • [28] Self-Supervised Approach for Few-shot Hand Gesture Recognition
    Kimura, Naoki
    [J]. ADJUNCT PROCEEDINGS OF THE 35TH ACM SYMPOSIUM ON USER INTERFACE SOFTWARE & TECHNOLOGY, UIST 2022, 2022,
  • [29] Self-Supervised Task Augmentation for Few-Shot Intent Detection
    Peng-Fei Sun
    Ya-Wen Ouyang
    Ding-Jie Song
    Xin-Yu Dai
    [J]. Journal of Computer Science and Technology, 2022, 37 : 527 - 538
  • [30] Multi-task Self-supervised Few-Shot Detection
    Zhang, Guangyong
    Duan, Lijuan
    Wang, Wenjian
    Gong, Zhi
    Ma, Bian
    [J]. PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT XII, 2024, 14436 : 107 - 119