Overcoming Catastrophic Forgetting Using Sparse Coding and Meta Learning

Cited: 8
Authors
Hurtado, Julio [1 ]
Lobel, Hans [1 ,2 ]
Soto, Alvaro [1 ]
Affiliations
[1] Pontificia Univ Catolica Chile, Dept Comp Sci, Santiago 7820436, Chile
[2] Pontificia Univ Catolica Chile, Dept Transport Engn & Logist, Santiago 7820436, Chile
Keywords
Task analysis; Interference; Training; Knowledge transfer; Adaptation models; Data models; Context modeling; Artificial intelligence; learning (artificial intelligence); machine learning; supervised learning; continual learning
DOI
10.1109/ACCESS.2021.3090672
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Continual learning occurs naturally in human beings. However, Deep Learning methods suffer from a problem known as Catastrophic Forgetting (CF): a model's performance on previously learned tasks drops drastically when it is trained sequentially on new tasks. This situation, known as task interference, occurs when a network modifies relevant weight values as it learns a new task. In this work, we propose two main strategies to address task interference in convolutional neural networks. First, we use a sparse coding technique to adaptively allocate model capacity to different tasks, avoiding interference between them. Specifically, we use a strategy based on group sparse regularization to specialize groups of parameters to learn each task. Afterward, by adding binary masks, we can freeze these groups of parameters and use the rest of the network to learn new tasks. Second, we use a meta-learning technique to foster knowledge transfer among tasks, encouraging weight reuse instead of overwriting. Specifically, we use an optimization strategy based on episodic training to foster the learning of weights that are expected to be useful for solving future tasks. Together, these two strategies help us avoid interference by preserving compatibility with previous and future weight values. Using this approach, we achieve state-of-the-art results on popular benchmarks used to test techniques for avoiding CF. In particular, we conduct an ablation study to identify the contribution of each component of the proposed method, demonstrating its ability to avoid retroactive interference with previous tasks and to promote knowledge transfer to future tasks.
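To make the abstract's first strategy concrete, below is a minimal PyTorch-style sketch, not the authors' released code; the class and method names (GroupSparseCL, claim_groups, zero_frozen_grads) are hypothetical. It shows how group sparse regularization and binary masks could interact: a group-lasso penalty pushes whole convolutional filters toward zero while a task is learned, and a per-filter binary mask then freezes the surviving filters so later tasks update only the remaining capacity.

```python
# Minimal sketch of the abstract's masking mechanism (hypothetical names; the
# paper's actual implementation may differ):
#   1) a group-lasso penalty over convolutional filters (group sparsity),
#   2) binary masks that freeze the filter groups claimed by earlier tasks.

import torch
import torch.nn as nn


class GroupSparseCL:
    """Wraps a CNN with per-filter sparsity and task-ownership masks."""

    def __init__(self, model: nn.Module, sparsity_weight: float = 1e-4):
        self.model = model
        self.sparsity_weight = sparsity_weight
        # One boolean mask per conv layer; True = filter frozen by a past task.
        self.frozen = {
            name: torch.zeros(m.out_channels, dtype=torch.bool, device=m.weight.device)
            for name, m in model.named_modules()
            if isinstance(m, nn.Conv2d)
        }

    def group_sparsity_penalty(self):
        """Group-lasso term: sum of L2 norms of the still-unclaimed filters."""
        penalty = 0.0
        for name, m in self.model.named_modules():
            if isinstance(m, nn.Conv2d):
                norms = m.weight.flatten(1).norm(dim=1)  # one L2 norm per filter
                penalty = penalty + norms[~self.frozen[name]].sum()
        return self.sparsity_weight * penalty

    def zero_frozen_grads(self):
        """Call between backward() and step() so old tasks' filters never move."""
        for name, m in self.model.named_modules():
            if isinstance(m, nn.Conv2d) and m.weight.grad is not None:
                m.weight.grad[self.frozen[name]] = 0.0

    def claim_groups(self, threshold: float = 1e-2):
        """After a task converges: freeze filters that survived the sparsity
        pressure, leaving near-zero filters free for future tasks."""
        for name, m in self.model.named_modules():
            if isinstance(m, nn.Conv2d):
                norms = m.weight.flatten(1).norm(dim=1)
                self.frozen[name] |= norms > threshold
```

In a training loop, the penalty would be added to the task loss before backward(), zero_frozen_grads() would run just before optimizer.step(), and claim_groups() once per task. The abstract's second ingredient, episodic meta-training that rewards weights useful for future tasks, would wrap this loop as an outer optimization and is omitted here for brevity.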
Pages: 88279-88290
Number of pages: 12