A Study on Catastrophic Forgetting in Deep LSTM Networks

Cited by: 17
Authors:
Schak, Monika [1 ]
Gepperth, Alexander [1 ]
Affiliations:
[1] Univ Appl Sci Fulda, D-36037 Fulda, Germany
Source:
ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: DEEP LEARNING, PT II | 2019 / Volume 11728
Keywords:
LSTM; Catastrophic Forgetting
DOI:
10.1007/978-3-030-30484-3_56
CLC Classification: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract:
We present a systematic study of Catastrophic Forgetting (CF), i.e., the abrupt loss of previously acquired knowledge, when retraining deep recurrent LSTM networks with new samples. CF has recently received renewed attention for feed-forward DNNs, and this article is the first work that aims to rigorously establish whether deep LSTM networks are afflicted by CF as well, and to what degree. To test this thoroughly, training is conducted on a wide variety of high-dimensional image-based sequence classification tasks derived from established visual classification benchmarks (MNIST, Devanagari, FashionMNIST and EMNIST). We find that the CF effect occurs universally, without exception, for deep LSTM-based sequence classifiers, regardless of the construction and provenance of the sequences. This leads us to conclude that LSTMs, just like DNNs, are fully affected by CF, and that further research is needed to determine how to avoid this effect (which is beyond the scope of this study).
Pages: 714-728
Page count: 15
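
The abstract describes a sequential retraining protocol: a deep LSTM sequence classifier is first trained on one set of classes, then retrained on new samples only, and its accuracy on the original classes is measured again to quantify forgetting. The sketch below is not the authors' code; it is a minimal PyTorch illustration of that protocol on MNIST, where each image is read row by row as a 28-step sequence. The class split into two tasks, the network size, the optimizer, and the epoch counts are illustrative assumptions.

```python
# Minimal sketch of a two-task catastrophic-forgetting experiment with an LSTM
# sequence classifier on MNIST (rows of the image as time steps). Hyperparameters
# and the task split are assumptions, not the settings used in the paper.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Subset
from torchvision import datasets, transforms


class RowLSTM(nn.Module):
    """LSTM classifier that reads a 28x28 image as a sequence of 28 rows."""

    def __init__(self, hidden=128, layers=2, classes=10):
        super().__init__()
        self.lstm = nn.LSTM(input_size=28, hidden_size=hidden,
                            num_layers=layers, batch_first=True)
        self.head = nn.Linear(hidden, classes)

    def forward(self, x):               # x: (batch, 1, 28, 28)
        seq = x.squeeze(1)              # -> (batch, 28, 28): 28 steps of 28 features
        out, _ = self.lstm(seq)
        return self.head(out[:, -1])    # classify from the last time step


def loader_for(ds, classes, shuffle):
    # MNIST-style torchvision datasets expose integer labels via .targets
    idx = [i for i, y in enumerate(ds.targets.tolist()) if y in classes]
    return DataLoader(Subset(ds, idx), batch_size=128, shuffle=shuffle)


def run_epoch(model, loader, opt=None):
    """One pass over the loader; trains if an optimizer is given, returns accuracy."""
    training = opt is not None
    model.train(training)
    correct, total = 0, 0
    with torch.set_grad_enabled(training):
        for x, y in loader:
            logits = model(x)
            if training:
                loss = nn.functional.cross_entropy(logits, y)
                opt.zero_grad()
                loss.backward()
                opt.step()
            correct += (logits.argmax(dim=1) == y).sum().item()
            total += y.numel()
    return correct / total


if __name__ == "__main__":
    tfm = transforms.ToTensor()
    train_ds = datasets.MNIST("data", train=True, download=True, transform=tfm)
    test_ds = datasets.MNIST("data", train=False, download=True, transform=tfm)

    task_a, task_b = [0, 1, 2, 3, 4], [5, 6, 7, 8, 9]    # illustrative class split
    model = RowLSTM()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    for _ in range(2):                                   # train on task A only
        run_epoch(model, loader_for(train_ds, task_a, True), opt)
    acc_before = run_epoch(model, loader_for(test_ds, task_a, False))

    for _ in range(2):                                   # retrain on task B only
        run_epoch(model, loader_for(train_ds, task_b, True), opt)
    acc_after = run_epoch(model, loader_for(test_ds, task_a, False))

    print(f"Task A accuracy before retraining: {acc_before:.3f}")
    print(f"Task A accuracy after retraining on task B: {acc_after:.3f}")
```

A large drop from the first to the second printed accuracy is the catastrophic-forgetting effect the paper studies; the study itself covers a much wider range of sequence constructions and benchmarks than this single split.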