A Study on Catastrophic Forgetting in Deep LSTM Networks

Cited by: 17
Authors
Schak, Monika [1]
Gepperth, Alexander [1]
Affiliations
[1] Univ Appl Sci Fulda, D-36037 Fulda, Germany
Source
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: DEEP LEARNING, PT II | 2019, Vol. 11728
Keywords
LSTM; Catastrophic Forgetting
DOI
10.1007/978-3-030-30484-3_56
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We present a systematic study of Catastrophic Forgetting (CF), i.e., the abrupt loss of previously acquired knowledge, when retraining deep recurrent LSTM networks with new samples. CF has recently received renewed attention in the case of feed-forward DNNs, and this article is the first work that aims to rigorously establish whether deep LSTM networks are afflicted by CF as well, and to what degree. In order to test this fully, training is conducted using a wide variety of high-dimensional image-based sequence classification tasks derived from established visual classification benchmarks (MNIST, Devanagari, FashionMNIST and EMNIST). We find that the CF effect occurs universally, without exception, for deep LSTM-based sequence classifiers, regardless of the construction and provenance of sequences. This leads us to conclude that LSTMs, just like DNNs, are fully affected by CF, and that further research work needs to be conducted in order to determine how to avoid this effect (which is not a goal of this study).
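The abstract describes a sequential-retraining protocol: train on one task, retrain on a second, and measure how much accuracy on the first task collapses. As a hedged illustration only (not the paper's experimental code, and using a toy linear classifier on synthetic 2-D data instead of a deep LSTM on image-based sequences), the measurement protocol can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(w_true, n=400):
    """Synthetic binary task: label = sign of projection onto w_true."""
    X = rng.normal(size=(n, 2))
    y = (X @ w_true > 0).astype(float)
    return X, y

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(w, X, y, lr=0.5, epochs=200):
    """Plain gradient descent on the logistic loss."""
    for _ in range(epochs):
        p = sigmoid(X @ w)
        w = w - lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return float(np.mean((sigmoid(X @ w) > 0.5) == y))

# Two tasks with conflicting decision boundaries (hypothetical setup).
Xa, ya = make_task(np.array([1.0, 1.0]))
Xb, yb = make_task(np.array([-1.0, 1.0]))

w = np.zeros(2)
w = train(w, Xa, ya)                  # learn task A
acc_a_before = accuracy(w, Xa, ya)

w = train(w, Xb, yb)                  # retrain on task B only
acc_a_after = accuracy(w, Xa, ya)     # re-test on task A

print(f"task A accuracy before retraining: {acc_a_before:.2f}")
print(f"task A accuracy after retraining:  {acc_a_after:.2f}")
```

The drop in task-A accuracy after retraining is the catastrophic-forgetting signature; the paper applies this kind of protocol to deep LSTM sequence classifiers on tasks derived from MNIST, Devanagari, FashionMNIST, and EMNIST.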
Pages: 714-728 (15 pages)
Related Papers (50 records)
  • [41] Mitigation of Catastrophic Forgetting in Recurrent Neural Networks using a Fixed Expansion Layer
    Coop, Robert
    Arel, Itamar
    2013 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2013,
  • [43] A comparative study and analysis of LSTM deep neural networks for heartbeats classification
    Hiriyannaiah, Srinidhi
    Siddesh, G. M.
    Kiran, M. H. M.
    Srinivasa, K. G.
    HEALTH AND TECHNOLOGY, 2021, 11 (03) : 663 - 671
  • [44] Handling catastrophic forgetting using cross-domain order in incremental deep learning
    Kumar, Ashutosh
    Agarwal, Sonali
    JOURNAL OF ELECTRONIC IMAGING, 2023, 32 (02)
  • [45] Overcoming catastrophic forgetting with classifier expander
    Liu, Xinchen
    Wang, Hongbo
    Tian, Yingjian
    Xie, Linyao
    ASIAN CONFERENCE ON MACHINE LEARNING, VOL 222, 2023, 222
  • [46] Mitigate Catastrophic Forgetting by Varying Goals
    Chen, Lu
    Masayuki, Murata
    ICAART: PROCEEDINGS OF THE 12TH INTERNATIONAL CONFERENCE ON AGENTS AND ARTIFICIAL INTELLIGENCE, VOL 2, 2020, : 530 - 537
  • [47] Is catastrophic forgetting Bayes-optimal?
    Sajid, Noor
    Convertino, Laura
    Neacsu, Victorita
    Parr, Thomas
    Friston, Karl
    JOURNAL OF COMPUTATIONAL NEUROSCIENCE, 2023, 51 : S42 - S43
  • [48] The Role Weights Play in Catastrophic Forgetting
    Hintze, Arend
    2021 8TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING & MACHINE INTELLIGENCE (ISCMI 2021), 2021, : 160 - 166
  • [49] Combating catastrophic forgetting with developmental compression
    Beaulieu, Shawn L. E.
    Kriegman, Sam
    Bongard, Josh C.
    GECCO'18: PROCEEDINGS OF THE 2018 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, 2018, : 386 - 393
  • [50] Catastrophic forgetting and mode collapse in GANs
    Hoang Thanh-Tung
    Truyen Tran
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,