DyCR: A Dynamic Clustering and Recovering Network for Few-Shot Class-Incremental Learning

Citations: 0
Authors
Pan, Zicheng [1 ]
Yu, Xiaohan [1 ,2 ]
Zhang, Miaohua [1 ,3 ]
Zhang, Weichuan [1 ,4 ]
Gao, Yongsheng [1 ]
Affiliations
[1] Griffith Univ, Inst Integrated & Intelligent Syst, Brisbane, Qld 4111, Australia
[2] Macquarie Univ, Sch Comp, Sydney, NSW 2109, Australia
[3] CSIRO, Efficient Comp Vis Team, Black Mt Site, Canberra, ACT 2601, Australia
[4] Shaanxi Univ Sci & Technol, Sch Elect Informat & Artificial Intelligence, Xian 710026, Peoples R China
Funding
Australian Research Council
Keywords
Dynamic network; embedding space optimization; feature reconstruction; few-shot class-incremental learning (FSCIL); orthogonal decomposition;
DOI
10.1109/TNNLS.2024.3394844
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Few-shot class-incremental learning (FSCIL) aims to continually learn novel classes from limited samples. One of the major challenges is catastrophic forgetting of old knowledge while training the model on new data. To alleviate this problem, recent state-of-the-art methods keep a well-trained network static, with fixed parameters, during the incremental learning stages to preserve old knowledge. Such methods suffer from poor adaptation of the old model to new knowledge. In this work, a dynamic clustering and recovering network (DyCR) is proposed to tackle the adaptation problem and effectively mitigate forgetting in FSCIL tasks. Unlike static FSCIL methods, the proposed DyCR network remains dynamic and trainable during the incremental learning stages, which allows it to learn new features and better adapt to novel data. To address the forgetting problem and improve performance, a novel orthogonal decomposition mechanism is developed to split feature embeddings into context and category components. The context part is preserved and used to recover old-class features at future incremental learning stages, mitigating forgetting with far less stored data than saving raw exemplars. The category part is used to optimize the feature embedding space during training by pushing samples of different classes far apart and compressing the distances among samples of the same class. Experiments show that the DyCR network outperforms existing methods on four benchmark datasets. The code is available at: https://github.com/zichengpan/DyCR.
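The orthogonal decomposition described in the abstract can be illustrated with a minimal sketch: project each embedding onto its class-prototype direction to obtain a category component, keep the orthogonal residual as the context component, and train with a margin loss that pulls samples toward their own prototype while pushing them away from other classes. This is not the authors' implementation (see the GitHub repository linked above); the function names (decompose_feature, clustering_loss), the prototype-based decomposition, and the margin value are illustrative assumptions only.

```python
import torch
import torch.nn.functional as F


def decompose_feature(feat, prototype):
    """Split an embedding into a category part (its projection onto the class
    prototype direction) and an orthogonal context residual."""
    direction = F.normalize(prototype, dim=-1)                        # unit class direction
    category = (feat * direction).sum(-1, keepdim=True) * direction   # component parallel to the prototype
    context = feat - category                                         # orthogonal remainder
    return category, context


def clustering_loss(feats, labels, prototypes, margin=0.5):
    """Pull samples toward their own class prototype and push them at least
    `margin` further from the closest other prototype (a simple metric-learning
    surrogate for the embedding-space optimization described in the abstract)."""
    feats = F.normalize(feats, dim=-1)
    protos = F.normalize(prototypes, dim=-1)
    sims = feats @ protos.t()                                   # cosine similarity to every prototype
    pos = sims.gather(1, labels.view(-1, 1)).squeeze(1)         # similarity to own class
    neg = sims.scatter(1, labels.view(-1, 1), float('-inf')).max(dim=1).values  # hardest other class
    return F.relu(margin - pos + neg).mean()


# Toy usage: 8 samples, 64-dim embeddings, 5 classes.
feats = torch.randn(8, 64)
labels = torch.randint(0, 5, (8,))
prototypes = torch.randn(5, 64)
category, context = decompose_feature(feats, prototypes[labels])
loss = clustering_loss(feats, labels, prototypes)
```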
Pages: 14