Boosting Generalized Few-Shot Learning by Scattering Intra-class Distribution

Cited by: 0
Authors
Yu, Yunlong [1]
Jin, Lisha [1]
Li, Yingming [1]
Affiliations
[1] Zhejiang Univ, Hangzhou, Zhejiang, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Generalized Few-Shot Learning; Scatter Intra-class Distribution; Feature Representation;
DOI
10.1007/978-3-031-43415-0_26
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Generalized Few-Shot Learning (GFSL) applies a model trained on the base classes to predict samples from both base and novel classes, where each novel class is provided with only a few labeled samples during testing. Owing to the severe data imbalance between base and novel classes, GFSL easily suffers from the prediction-shift issue, in which most test samples tend to be classified into the base classes. Unlike existing works that address this issue with either multi-stage training or complicated model designs, we argue that extracting both discriminative and generalized feature representations is all GFSL needs, which can be achieved by simply scattering the intra-class distribution during training. Specifically, we introduce two self-supervised auxiliary tasks and a label permutation task to encourage the model to learn more image-level feature representations and to push the decision boundary from the novel classes towards the base classes during inference. Our method is one-stage and can perform online inference. Experiments on the miniImageNet and tieredImageNet datasets show that the proposed method achieves performance comparable with state-of-the-art multi-stage competitors on both traditional FSL and GFSL tasks, empirically demonstrating that feature representation is the key to GFSL.
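The abstract names the ingredients (two self-supervised auxiliary tasks plus a label permutation task) but not their concrete form, so the following is only a minimal sketch of the general recipe, assuming rotation prediction as the self-supervised auxiliary task; the class and function names (GFSLNet, training_step, aux_weight) are hypothetical and not taken from the paper.

    # Minimal sketch (not the authors' released code): a base-class classifier
    # trained jointly with a rotation-prediction auxiliary head, one common way
    # to encourage image-level features and scatter the intra-class distribution.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GFSLNet(nn.Module):
        def __init__(self, backbone: nn.Module, feat_dim: int, num_base_classes: int):
            super().__init__()
            self.backbone = backbone                                # any CNN that outputs a (N, feat_dim) feature
            self.cls_head = nn.Linear(feat_dim, num_base_classes)   # base-class classifier
            self.rot_head = nn.Linear(feat_dim, 4)                  # auxiliary head: 0/90/180/270 degrees

        def forward(self, x):
            feat = self.backbone(x)
            return self.cls_head(feat), self.rot_head(feat)

    def training_step(model, images, labels, optimizer, aux_weight=1.0):
        """One joint step: base-class cross-entropy + rotation-prediction auxiliary loss."""
        # Build the four rotated copies and their pseudo-labels (0, 1, 2, 3 -> 0/90/180/270 degrees).
        rotated = torch.cat([torch.rot90(images, k, dims=(2, 3)) for k in range(4)], dim=0)
        rot_labels = torch.arange(4, device=images.device).repeat_interleave(images.size(0))
        cls_labels = labels.repeat(4)

        cls_logits, rot_logits = model(rotated)
        loss = F.cross_entropy(cls_logits, cls_labels) \
             + aux_weight * F.cross_entropy(rot_logits, rot_labels)

        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

The label-permutation task and the second self-supervised task would enter analogously, each as an extra head and an extra weighted loss term inside training_step.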
Pages: 438-453
Number of pages: 16
Related Papers
50 records in total
• [31] Tian, Songsong; Li, Lusi; Li, Weijun; Ran, Hang; Ning, Xin; Tiwari, Prayag. A survey on few-shot class-incremental learning. Neural Networks, 2024, 169: 307-324.
• [33] Tan, Zhen; Ding, Kaize; Guo, Ruocheng; Liu, Huan. Graph Few-shot Class-incremental Learning. WSDM'22: Proceedings of the Fifteenth ACM International Conference on Web Search and Data Mining, 2022: 987-996.
• [34] Hersche, Michael; Karunaratne, Geethan; Cherubini, Giovanni; Benini, Luca; Sebastian, Abu; Rahimi, Abbas. Constrained Few-shot Class-incremental Learning. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022: 9047-9057.
• [35] Chen, Zitian; Maji, Subhransu; Learned-Miller, Erik. Shot in the Dark: Few-Shot Learning with No Base-Class Labels. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), 2021: 2662-2671.
• [36] Shi, Chengzhang; Own, Chung-Ming; Chou, Ching-chih; Guo, Bailu. Critic Boosting Attention Network on Local Descriptor for Few-shot Learning. 2021 International Joint Conference on Neural Networks (IJCNN), 2021.
• [37] Cai, Anping; Chen, Leiting; Chen, Yongqi; He, Ziyu; Tao, Shuqing; Zhou, Chuan. Adaptive attribute distribution similarity for few-shot learning. Image and Vision Computing, 2024, 148.
• [38] McClurg, Christopher; Ayub, Ali; Tyagi, Harsh; Rajtmajer, Sarah M.; Wagner, Alan R. Active Class Selection for Few-Shot Class-Incremental Learning. Conference on Lifelong Learning Agents, 2023, 232: 811-827.
• [39] Fu, Zhiling; Tang, Dongfang; Ma, Pingchuan; Wang, Zhe; Gao, Wen. Distributed few-shot learning with prototype distribution correction. Applied Intelligence, 2023, 53(24): 30552-30565.
• [40] Zeng, Xinke; Huang, Bo; Jia, Ke; Jia, Li; Zhao, Ke. Adaptive few-shot learning with a fair priori distribution. Computers & Electrical Engineering, 2022, 102.