Memory Efficient Class-Incremental Learning for Image Classification

Cited by: 38
Authors
Zhao, Hanbin [1]
Wang, Hui [1]
Fu, Yongjian [1]
Wu, Fei [1]
Li, Xi [1,2]
Affiliations
[1] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou 310027, Peoples R China
[2] Zhejiang Univ, Shanghai Inst Adv Study, Shanghai 201210, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Feature extraction; Knowledge transfer; Data mining; Adaptation models; Training; Noise measurement; Knowledge engineering; Catastrophic forgetting; class-incremental learning (CIL); classification; exemplar; memory efficient
DOI
10.1109/TNNLS.2021.3072041
CLC number
TP18 [Theory of Artificial Intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Under memory-resource-limited constraints, class-incremental learning (CIL) usually suffers from the "catastrophic forgetting" problem when the joint classification model is updated on the arrival of newly added classes. To cope with the forgetting problem, many CIL methods transfer the knowledge of old classes by preserving some exemplar samples in a size-constrained memory buffer. To utilize the memory buffer more efficiently, we propose to keep more auxiliary low-fidelity exemplar samples rather than the original high-fidelity exemplar samples. Such a memory-efficient exemplar-preserving scheme makes old-class knowledge transfer more effective. However, the low-fidelity exemplar samples are often distributed in a domain different from that of the original exemplar samples, that is, a domain shift arises. To alleviate this problem, we propose a duplet learning scheme that constructs domain-compatible feature extractors and classifiers, which greatly narrows the above domain gap. As a result, these low-fidelity auxiliary exemplar samples can moderately replace the original exemplar samples at a lower memory cost. In addition, we present a robust classifier adaptation scheme, which further refines the biased classifier (learned with samples containing distillation label knowledge about old classes) using samples with pure true class labels. Experimental results demonstrate the effectiveness of this work against state-of-the-art approaches. We will release the code, baselines, and training statistics for all models to facilitate future research.
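The abstract describes two ingredients that a short sketch can make concrete: storing low-fidelity exemplars so that more of them fit into a fixed memory budget, and training the feature extractor so that low-fidelity and original samples yield compatible features. The Python/PyTorch snippet below is a minimal illustration under assumed choices (downsampling as the low-fidelity transform, a simple feature-matching loss standing in for the duplet learning scheme, a toy backbone, and the hypothetical helper names to_low_fidelity and duplet_loss); it is not the authors' released implementation.

    # Minimal sketch (not the paper's code): low-fidelity exemplar storage plus
    # a duplet-style feature-matching term that keeps the extractor compatible
    # across the original and low-fidelity domains.
    import torch
    import torch.nn.functional as F
    from torch import nn


    def to_low_fidelity(images: torch.Tensor, factor: int = 2) -> torch.Tensor:
        """Downsample exemplar images by `factor` to cut their memory footprint.

        A factor-2 downsample shrinks each image to about 1/4 of its original
        size, so roughly 4x more exemplars fit into the same buffer.
        """
        h, w = images.shape[-2:]
        return F.interpolate(images, size=(h // factor, w // factor),
                             mode="bilinear", align_corners=False)


    def duplet_loss(model: nn.Module, originals: torch.Tensor,
                    low_fidelity: torch.Tensor) -> torch.Tensor:
        """Pull features of an image and its low-fidelity copy together."""
        # Upsample stored low-fidelity exemplars back to the input resolution
        # expected by the backbone before extracting features.
        restored = F.interpolate(low_fidelity, size=originals.shape[-2:],
                                 mode="bilinear", align_corners=False)
        feat_orig = model(originals)
        feat_low = model(restored)
        return F.mse_loss(feat_low, feat_orig)


    if __name__ == "__main__":
        # Toy feature extractor standing in for the CIL backbone.
        backbone = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                                 nn.AdaptiveAvgPool2d(1), nn.Flatten())
        batch = torch.randn(4, 3, 32, 32)    # current-task training images
        exemplars = to_low_fidelity(batch)   # what would be kept in memory
        loss = duplet_loss(backbone, batch, exemplars)
        print(exemplars.shape, loss.item())

With a factor-2 downsample, each stored exemplar occupies roughly a quarter of its original footprint, so about four times as many exemplars fit in the same buffer; the feature-matching term is what would let such cheaper exemplars stand in for the originals during rehearsal.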
Pages: 5966-5977
Page count: 12
Related papers
50 items in total
  • [31] Semantic Knowledge Guided Class-Incremental Learning. Wang, Shaokun; Shi, Weiwei; Dong, Songlin; Gao, Xinyuan; Song, Xiang; Gong, Yihong. IEEE Transactions on Circuits and Systems for Video Technology, 2023, 33(10): 5921-5931.
  • [32] Semantic Drift Compensation for Class-Incremental Learning. Yu, Lu; Twardowski, Bartlomiej; Liu, Xialei; Herranz, Luis; Wang, Kai; Cheng, Yongmei; Jui, Shangling; van de Weijer, Joost. 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2020: 6980-6989.
  • [33] Adaptive Aggregation Networks for Class-Incremental Learning. Liu, Yaoyao; Schiele, Bernt; Sun, Qianru. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2021), 2021: 2544-2553.
  • [34] Audio-Visual Class-Incremental Learning. Pian, Weiguo; Mo, Shentong; Guo, Yunhui; Tian, Yapeng. 2023 IEEE/CVF International Conference on Computer Vision (ICCV), 2023: 7765-7777.
  • [35] Class-Incremental Learning via Dual Augmentation. Zhu, Fei; Cheng, Zhen; Zhang, Xu-Yao; Liu, Cheng-Lin. Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021, 34.
  • [36] Generative Feature Replay for Class-Incremental Learning. Liu, Xialei; Wu, Chenshen; Menta, Mikel; Herranz, Luis; Raducanu, Bogdan; Bagdanov, Andrew D.; Jui, Shangling; van de Weijer, Joost. 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW 2020), 2020: 915-924.
  • [37] Model Behavior Preserving for Class-Incremental Learning. Liu, Yu; Hong, Xiaopeng; Tao, Xiaoyu; Dong, Songlin; Shi, Jingang; Gong, Yihong. IEEE Transactions on Neural Networks and Learning Systems, 2023, 34(10): 7529-7540.
  • [38] Class-Incremental Learning based on Label Generation. Shao, Yijia; Guo, Yiduo; Zhao, Dongyan; Liu, Bing. 61st Conference of the Association for Computational Linguistics (ACL 2023), Vol. 2, 2023: 1263-1276.
  • [39] Class-Incremental Learning via Knowledge Amalgamation. de Carvalho, Marcus; Pratama, Mahardhika; Zhang, Jie; Sun, Yajuan. Machine Learning and Knowledge Discovery in Databases (ECML PKDD 2022), Pt. III, 2023, 13715: 36-50.
  • [40] Class-incremental learning with causal relational replay. Nguyen, Toan; Kieu, Duc; Duong, Bao; Kieu, Tung; Do, Kien; Nguyen, Thin; Le, Bac. Expert Systems with Applications, 2024, 250.