Flexible few-shot class-incremental learning with prototype container

Cited by: 2
Authors
Xu, Xinlei [1 ,2 ]
Wang, Zhe [1 ,2 ]
Fu, Zhiling [1 ,2 ]
Guo, Wei [1 ,2 ]
Chi, Ziqiu [1 ,2 ]
Li, Dongdong [1 ,2 ]
Affiliations
[1] East China Univ Sci & Technol, Key Lab Smart Mfg Energy Chem Proc, Minist Educ, Shanghai 200237, Peoples R China
[2] East China Univ Sci & Technol, Dept Comp Sci & Engn, Shanghai 200237, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2023, Vol. 35, Issue 15
Keywords
Few-shot class-incremental learning; Few-shot learning; Incremental learning; Information
DOI
10.1007/s00521-023-08272-y
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In few-shot class-incremental learning, new-class samples are used to learn the characteristics of new classes, while stored old-class exemplars are used to prevent forgetting of old knowledge. The limited number of new-class samples makes overfitting likely during incremental training, and storing many old exemplars consumes a large amount of space. To address these difficulties, in this paper we propose a novel, flexible few-shot class-incremental framework that makes the incremental process efficient and convenient. We enhance the expressive power of extracted features through multistage pre-training. We then set up a prototype container that stores one prototype per class to retain old knowledge. When new classes arrive, we compute their prototypes and update the prototype container. Finally, we obtain the prediction result through similarity weighting. The entire framework only trains the base-class classifier and requires no further training during the incremental process; this avoids overfitting on novel classes and saves the time that further training would cost. Moreover, storing prototypes requires far less space than storing raw image data. Overall, the framework has the advantage of flexibility. We conduct extensive experiments on three standard few-shot class-incremental datasets and achieve state-of-the-art results. To further verify the framework's flexibility, we additionally discuss a special federated few-shot class-incremental scenario. The absence of further training and the reduced storage consumption open the possibility of applications in more complex scenarios.
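The prototype-container idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the class name, the per-class mean prototype, and cosine similarity as the weighting measure are all assumptions made for illustration; the paper's actual feature extractor and similarity weighting may differ.

```python
import numpy as np

class PrototypeContainer:
    """Illustrative sketch: store one mean-feature prototype per class and
    classify by cosine similarity to the stored prototypes. New classes are
    handled by adding prototypes, with no retraining of the model."""

    def __init__(self):
        self.prototypes = {}  # class label -> prototype (mean feature vector)

    def update(self, features, labels):
        # Compute and store the mean feature vector of each class present
        # in this batch (base classes at first, new classes incrementally).
        for c in np.unique(labels):
            self.prototypes[int(c)] = features[labels == c].mean(axis=0)

    def predict(self, feature):
        # Similarity-weighted prediction: pick the class whose prototype
        # has the highest cosine similarity to the query feature.
        labels = list(self.prototypes)
        protos = np.stack([self.prototypes[c] for c in labels])
        sims = protos @ feature / (
            np.linalg.norm(protos, axis=1) * np.linalg.norm(feature) + 1e-12)
        return labels[int(np.argmax(sims))]
```

In this sketch, incremental learning of a new class amounts to one call to `update` with the few available shots, which matches the abstract's claim that no further training is needed once the base-class classifier is fixed.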
Pages: 10875-10889
Page count: 15