Dynamic Memory-Based Continual Learning with Generating and Screening

Cited by: 0
Authors
Tao, Siying [1]
Huang, Jinyang [1]
Zhang, Xiang [2]
Sun, Xiao [1,3]
Gu, Yu [4]
Affiliations
[1] Hefei Univ Technol, Sch Comp Sci & Informat Engn, Hefei, Peoples R China
[2] Univ Sci & Technol China, Sch Cybers Sci & Technol, Hefei, Peoples R China
[3] Hefei Comprehens Natl Sci Ctr, Inst Artificial Intelligence, Hefei, Peoples R China
[4] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, I Lab, Chengdu, Peoples R China
Keywords
Continual learning; Generative replay; Deep learning
DOI
10.1007/978-3-031-44213-1_31
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Deep neural networks suffer from catastrophic forgetting when continually learning new tasks. Although simply replaying all previous data alleviates the problem, it requires large memory and, even worse, is often infeasible in real-world applications where access to past data is limited. We therefore propose a two-stage framework that dynamically reproduces the data features of previous tasks to reduce catastrophic forgetting. Specifically, at each task step we use a new memory module to learn the data distribution of the new task, and we reproduce pseudo-data from previous memory modules to learn alongside it. This lets us integrate new visual concepts while retaining learned knowledge, achieving a better stability-malleability balance. We introduce an N-step model fusion strategy to accelerate the memorization process of the memory module, and a screening strategy to control the quantity and quality of the generated data, reducing distribution differences. Experiments on the CIFAR-100, MNIST, and SVHN datasets demonstrate the effectiveness of our method.
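The abstract describes a generative-replay loop: one memory module per task learns that task's data distribution, pseudo-data sampled from earlier modules is screened for quality and quantity, and the classifier is trained on the mixture of real and pseudo-data. The following is a minimal PyTorch sketch of that loop, not the authors' implementation: the choice of a small VAE as the memory module, the confidence threshold and per-module replay budget, and all names (MemoryVAE, screen, train_task) are illustrative assumptions, and the paper's N-step model fusion strategy is omitted.

# Minimal sketch of generative replay with per-task memory modules and a
# confidence-based screening step. Assumptions (not from the paper): each
# memory module is a small VAE, screening keeps pseudo-samples whose
# predicted-class confidence under the previous classifier exceeds a
# threshold, and inputs are flattened and normalized to [0, 1].
import torch
import torch.nn as nn
import torch.nn.functional as F


class MemoryVAE(nn.Module):
    """One memory module per task: learns that task's data distribution."""
    def __init__(self, in_dim=784, z_dim=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.mu, self.logvar = nn.Linear(256, z_dim), nn.Linear(256, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(),
                                 nn.Linear(256, in_dim), nn.Sigmoid())

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        return self.dec(z), mu, logvar

    @torch.no_grad()
    def sample(self, n):
        z = torch.randn(n, self.mu.out_features)
        return self.dec(z)


def screen(pseudo_x, old_classifier, threshold=0.9, max_keep=256):
    """Screening: keep only confident pseudo-samples, capped in quantity."""
    with torch.no_grad():
        probs = F.softmax(old_classifier(pseudo_x), dim=1)
        conf, labels = probs.max(dim=1)
    keep = conf > threshold
    return pseudo_x[keep][:max_keep], labels[keep][:max_keep]


def train_task(classifier, memories, task_loader, old_classifier=None,
               epochs=1, lr=1e-3, replay_per_module=512):
    """Learn one new task: fit a new memory module, replay screened pseudo-data."""
    new_memory = MemoryVAE()
    mem_opt = torch.optim.Adam(new_memory.parameters(), lr=lr)
    clf_opt = torch.optim.Adam(classifier.parameters(), lr=lr)

    for _ in range(epochs):
        for x, y in task_loader:
            x = x.view(x.size(0), -1)

            # 1) Memorization: fit the new memory module on current task data.
            recon, mu, logvar = new_memory(x)
            kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp()) / x.size(0)
            mem_loss = F.binary_cross_entropy(recon, x) + 1e-3 * kl
            mem_opt.zero_grad(); mem_loss.backward(); mem_opt.step()

            # 2) Replay: generate and screen pseudo-data from previous memory modules.
            xs, ys = [x], [y]
            if memories and old_classifier is not None:
                for mem in memories:
                    px = mem.sample(replay_per_module)
                    px, py = screen(px, old_classifier)
                    if len(px) > 0:
                        xs.append(px); ys.append(py)
            mix_x, mix_y = torch.cat(xs), torch.cat(ys)

            # 3) Train the classifier on the mixture of real and pseudo-data.
            clf_loss = F.cross_entropy(classifier(mix_x), mix_y)
            clf_opt.zero_grad(); clf_loss.backward(); clf_opt.step()

    memories.append(new_memory)
    return memories

In a real run, old_classifier would be a frozen copy of the classifier saved after the previous task step, so that the screening labels stay consistent with the classes learned so far.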
Pages: 365-376
Page count: 12
Related papers
50 records in total
  • [31] MEMORY-BASED PEDESTRIAN DETECTION THROUGH SEQUENCE LEARNING
    Li, Xudong
    Ye, Mao
    Liu, Yiguang
    Zhu, Ce
    2017 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2017, : 1129 - 1134
  • [32] Flattening Sharpness for Dynamic Gradient Projection Memory Benefits Continual Learning
    Deng, Danruo
    Chen, Guangyong
    Hao, Jianye
    Wang, Qiong
    Heng, Pheng-Ann
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [34] Dynamic memory to alleviate catastrophic forgetting in continual learning with medical imaging
    Perkonigg, Matthias
    Hofmanninger, Johannes
    Herold, Christian J.
    Brink, James A.
    Pianykh, Oleg
    Prosch, Helmut
    Langs, Georg
    NATURE COMMUNICATIONS, 2021, 12 (01)
  • [35] Reinforcement Learning Using a Stochastic Gradient Method with Memory-Based Learning
    Yamada, Takafumi
    Yamaguchi, Satoshi
    ELECTRICAL ENGINEERING IN JAPAN, 2010, 173 (01) : 32 - 40
  • [36] Generating Accurate Pseudo Examples for Continual Learning
    Silver, Daniel L.
    Mahfuz, Sazia
    2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW 2020), 2020, : 1035 - 1042
  • [37] A Memory-Based Modular Architecture for SOM and LVQ with Dynamic Configuration
    An, Fengwei
    Zhang, Xiangyu
    Chen, Lei
    Mattausch, Hans Jurgen
    IEEE TRANSACTIONS ON MULTI-SCALE COMPUTING SYSTEMS, 2016, 2 (04): : 234 - 241
  • [38] Memory-based CHC Algorithms for the Dynamic Traveling Salesman Problem
    Simoes, Anabela
    Costa, Ernesto
    GECCO-2011: PROCEEDINGS OF THE 13TH ANNUAL GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, 2011, : 1037 - 1044
  • [39] MEMORY-BASED PARSING
    Lebowitz, M.
    ARTIFICIAL INTELLIGENCE, 1983, 21 (04) : 363 - 404
  • [40] Dynamic Consolidation for Continual Learning
    Li, Hang
    Ma, Chen
    Chen, Xi
    Liu, Xue
    NEURAL COMPUTATION, 2023, 35 (02) : 228 - 248