Measuring Catastrophic Forgetting in Neural Networks

Cited by: 0
Authors
Kemker, Ronald [1 ]
McClure, Marc [1 ]
Abitino, Angelina [2 ]
Hayes, Tyler L. [1 ]
Kanan, Christopher [1 ]
Institutions
[1] Rochester Inst Technol, Rochester, NY 14623 USA
[2] Swarthmore Coll, Swarthmore, PA 19081 USA
Keywords
MODEL
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405
Abstract
Deep neural networks are used in many state-of-the-art systems for machine perception. Once a network is trained to do a specific task, e.g., bird classification, it cannot easily be trained to do new tasks, e.g., incrementally learning to recognize additional bird species or learning an entirely different task such as flower recognition. When new tasks are added, typical deep neural networks are prone to catastrophically forgetting previous tasks. Networks that are capable of assimilating new information incrementally, much like how humans form new memories over time, will be more efficient than retraining the model from scratch each time a new task needs to be learned. There have been multiple attempts to develop schemes that mitigate catastrophic forgetting, but these methods have not been directly compared, the tests used to evaluate them vary considerably, and these methods have only been evaluated on small-scale problems (e.g., MNIST). In this paper, we introduce new metrics and benchmarks for directly comparing five different mechanisms designed to mitigate catastrophic forgetting in neural networks: regularization, ensembling, rehearsal, dual-memory, and sparse-coding. Our experiments on real-world images and sounds show that the mechanism(s) that are critical for optimal performance vary based on the incremental training paradigm and type of data being used, but they all demonstrate that the catastrophic forgetting problem is not yet solved.
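The abstract says the paper introduces new metrics for directly comparing forgetting-mitigation mechanisms. As a rough illustration only (the function name, argument layout, and normalization below are assumptions, not the paper's exact definitions), retention during incremental training can be summarized by normalizing per-session accuracies against an offline model trained on all data at once:

```python
def omega_metrics(base_accs, new_accs, all_accs, ideal_acc):
    """Summarize retention across incremental training sessions.

    base_accs[i]: accuracy on the original (base) task after session i
    new_accs[i]:  accuracy on the task newly learned in session i
    all_accs[i]:  accuracy on all tasks seen so far after session i
    ideal_acc:    accuracy of an offline model trained on everything at once
    """
    n = len(base_accs)
    # Retention of the base task, relative to the offline upper bound.
    omega_base = sum(a / ideal_acc for a in base_accs) / n
    # Ability to acquire each new task (no normalization needed).
    omega_new = sum(new_accs) / n
    # Overall performance on everything seen so far, again normalized.
    omega_all = sum(a / ideal_acc for a in all_accs) / n
    return omega_base, omega_new, omega_all

# Toy example: two incremental sessions, offline model reaches 1.0 accuracy.
scores = omega_metrics([0.8, 0.6], [0.9, 0.85], [0.7, 0.5], 1.0)
```

A model immune to forgetting would keep the first and third values near 1.0; a catastrophically forgetting model drives them toward chance while the second value stays high.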
Pages: 3390 / 3398
Page count: 9
Related Papers
50 records in total
  • [1] Overcoming catastrophic forgetting in neural networks
    Kirkpatrick, James
    Pascanu, Razvan
    Rabinowitz, Neil
    Veness, Joel
    Desjardins, Guillaume
    Rusu, Andrei A.
    Milan, Kieran
    Quan, John
    Ramalho, Tiago
    Grabska-Barwinska, Agnieszka
    Hassabis, Demis
    Clopath, Claudia
    Kumaran, Dharshan
    Hadsell, Raia
    [J]. PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2017, 114 (13) : 3521 - 3526
  • [2] Overcoming Catastrophic Forgetting in Graph Neural Networks
    Liu, Huihui
    Yang, Yiding
    Wang, Xinchao
    [J]. THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 8653 - 8661
  • [3] Unsupervised Learning to Overcome Catastrophic Forgetting in Neural Networks
    Munoz-Martin, Irene
    Bianchi, Stefano
    Pedretti, Giacomo
    Melnic, Octavian
    Ambrogio, Stefano
    Ielmini, Daniele
    [J]. IEEE JOURNAL ON EXPLORATORY SOLID-STATE COMPUTATIONAL DEVICES AND CIRCUITS, 2019, 5 (01): : 58 - 66
  • [4] Adaptation of artificial neural networks avoiding catastrophic forgetting
    Albesano, Dario
    Gemello, Roberto
    Laface, Pietro
    Mana, Franco
    Scanzio, Stefano
    [J]. 2006 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORK PROCEEDINGS, VOLS 1-10, 2006, : 1554 - 1561
  • [5] Evolving neural networks that suffer minimal catastrophic forgetting
    Seipone, T
    Bullinaria, JA
    [J]. MODELING LANGUAGE, COGNITION AND ACTION, 2005, 16 : 385 - 390
  • [6] Overcoming Catastrophic Forgetting in Graph Neural Networks with Experience Replay
    Zhou, Fan
    Cao, Chengtai
    [J]. THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 4714 - 4722
  • [7] Neuron Clustering for Mitigating Catastrophic Forgetting in Feedforward Neural Networks
    Goodrich, Ben
    Arel, Itamar
    [J]. 2014 IEEE SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE IN DYNAMIC AND UNCERTAIN ENVIRONMENTS (CIDUE), 2014, : 62 - 68
  • [8] Avoiding catastrophic forgetting by coupling two reverberating neural networks
    Ans, B
    Rousset, S
    [J]. COMPTES RENDUS DE L ACADEMIE DES SCIENCES SERIE III-SCIENCES DE LA VIE-LIFE SCIENCES, 1997, 320 (12): : 989 - 997
  • [9] Unsupervised Neuron Selection for Mitigating Catastrophic Forgetting in Neural Networks
    Goodrich, Ben
    Arel, Itamar
    [J]. 2014 IEEE 57TH INTERNATIONAL MIDWEST SYMPOSIUM ON CIRCUITS AND SYSTEMS (MWSCAS), 2014, : 997 - 1000
  • [10] Catastrophic forgetting in connectionist networks
    French, RM
    [J]. TRENDS IN COGNITIVE SCIENCES, 1999, 3 (04) : 128 - 135