Class-Incremental Learning using Diffusion Model for Distillation and Replay

Cited by: 0
Authors
Jodelet, Quentin [1 ,2 ]
Liu, Xin [2 ]
Phua, Yin Jun [1 ]
Murata, Tsuyoshi [1 ,2 ]
Affiliations
[1] Tokyo Inst Technol, Dept Comp Sci, Tokyo, Japan
[2] AIST, Artificial Intelligence Res Ctr, Japan
DOI
10.1109/ICCVW60793.2023.00367
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Class-incremental learning aims to learn new classes in an incremental fashion without forgetting the previously learned ones. Several research works have shown how additional data can be used by incremental models to help mitigate catastrophic forgetting. In this work, following the recent breakthrough in text-to-image generative models and their wide distribution, we propose the use of a pretrained Stable Diffusion model as a source of additional data for class-incremental learning. Compared to competitive methods that rely on external, often unlabeled, datasets of real images, our approach can generate synthetic samples belonging to the same classes as the previously encountered images. This allows us to use those additional data samples not only in the distillation loss but also for replay in the classification loss. Experiments on the competitive benchmarks CIFAR100, ImageNet-Subset, and ImageNet demonstrate how this new approach can be used to further improve the performance of state-of-the-art methods for class-incremental learning on large scale datasets.
Pages: 3417-3425
Page count: 9
Related Papers
50 records in total
  • [1] Generative Feature Replay For Class-Incremental Learning
    Liu, Xialei
    Wu, Chenshen
    Menta, Mikel
    Herranz, Luis
    Raducanu, Bogdan
    Bagdanov, Andrew D.
    Jui, Shangling
    van de Weijer, Joost
    2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW 2020), 2020, : 915 - 924
  • [2] Class-incremental learning with causal relational replay
    Nguyen, Toan
    Kieu, Duc
    Duong, Bao
    Kieu, Tung
    Do, Kien
    Nguyen, Thin
    Le, Bac
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 250
  • [3] Anchor Assisted Experience Replay for Online Class-Incremental Learning
    Lin, Huiwei
    Feng, Shanshan
    Li, Xutao
    Li, Wentao
    Ye, Yunming
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2023, 33 (05) : 2217 - 2232
  • [4] General Federated Class-Incremental Learning With Lightweight Generative Replay
    Chen, Yuanlu
    Tan, Alysa Ziying
    Feng, Siwei
    Yu, Han
    Deng, Tao
    Zhao, Libang
    Wu, Feng
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (20) : 33927 - 33939
  • [5] Class-Incremental Exemplar Compression for Class-Incremental Learning
    Luo, Zilin
    Liu, Yaoyao
    Schiele, Bernt
    Sun, Qianru
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 11371 - 11380
  • [6] Class-Incremental Experience Replay for Continual Learning under Concept Drift
    Korycki, Lukasz
    Krawczyk, Bartosz
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, CVPRW 2021, 2021, : 3644 - 3653
  • [7] Class-Incremental Learning by Knowledge Distillation with Adaptive Feature Consolidation
    Kang, Minsoo
    Park, Jaeyoo
    Han, Bohyung
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 16050 - 16059
  • [8] MixER: Mixup-Based Experience Replay for Online Class-Incremental Learning
    Lim, Won-Seon
    Zhou, Yu
    Kim, Dae-Won
    Lee, Jaesung
    IEEE ACCESS, 2024, 12 : 41801 - 41814
  • [9] Model Behavior Preserving for Class-Incremental Learning
    Liu, Yu
    Hong, Xiaopeng
    Tao, Xiaoyu
    Dong, Songlin
    Shi, Jingang
    Gong, Yihong
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (10) : 7529 - 7540
  • [10] Class-Incremental Learning for Remote Sensing Images Based on Knowledge Distillation
    Song, Jingduo
    Jia, Hecheng
    Xu, Feng
    IGARSS 2023 - 2023 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2023, : 5026 - 5028