Mitigating Catastrophic Forgetting with Complementary Layered Learning

Cited by: 1
Authors
Mondesire, Sean [1 ]
Wiegand, R. Paul [2 ]
Affiliations
[1] Univ Cent Florida, Inst Simulat & Training, Orlando, FL 32826 USA
[2] Winthrop Univ, Dept Comp Sci & Quantitat Methods, Rock Hill, SC 29733 USA
Keywords
layered learning; transfer learning; catastrophic forgetting; multi-agent system; behaviors
DOI
10.3390/electronics12030706
Chinese Library Classification
TP [automation technology; computer technology]
Subject Classification Code
0812
Abstract
Catastrophic forgetting is a stability-plasticity imbalance that causes a machine learner to lose previously gained knowledge that is critical for performing a task. The imbalance occurs in transfer learning, negatively affecting the learner's performance, particularly in neural networks and layered learning. This work proposes a complementary learning technique that introduces long- and short-term memory to layered learning to reduce the negative effects of catastrophic forgetting. Specifically, the dual-memory system is applied to non-neural-network instances of layered learning based on evolutionary computation and Q-learning, because these techniques are used to develop decision-making capabilities for physical robots. Experiments evaluate the new learning augmentation in a multi-agent simulation in which autonomous unmanned aerial vehicles learn to collaborate and maneuver to survey an area effectively. Through these direct-policy and value-based learning experiments, complementary layered learning is shown to significantly improve task performance over standard layered learning, successfully balancing stability and plasticity.
Pages: 17
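
To make the dual-memory idea in the abstract concrete, the sketch below pairs a fast-adapting short-term Q-table with a slowly consolidating long-term Q-table in a tabular Q-learning agent. This is a minimal illustration under assumed mechanics: the class name, the blended action selection, and the consolidation rule are choices made for this sketch, not the mechanism published in the paper.

```python
import random
from collections import defaultdict

class DualMemoryQLearner:
    """Tabular Q-learner with complementary short- and long-term memories.

    Illustrative sketch only: hyperparameter names and the consolidation
    rule are assumptions, not the authors' published implementation.
    """

    def __init__(self, actions, alpha=0.1, gamma=0.9, epsilon=0.1,
                 consolidation_rate=0.01, blend=0.5):
        self.actions = actions
        self.alpha = alpha                            # fast short-term learning rate
        self.gamma = gamma                            # discount factor
        self.epsilon = epsilon                        # exploration rate
        self.consolidation_rate = consolidation_rate  # slow drift of LTM toward STM
        self.blend = blend                            # weight on STM when acting
        self.stm = defaultdict(float)                 # plastic: tracks the current layer
        self.ltm = defaultdict(float)                 # stable: retains earlier layers

    def q(self, state, action):
        # Behavior comes from a blend of the stable and plastic estimates.
        key = (state, action)
        return self.blend * self.stm[key] + (1.0 - self.blend) * self.ltm[key]

    def act(self, state):
        # Epsilon-greedy action selection over the blended Q-values.
        if random.random() < self.epsilon:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.q(state, a))

    def update(self, state, action, reward, next_state):
        # Standard Q-learning target, computed from the blended estimate.
        target = reward + self.gamma * max(self.q(next_state, a)
                                           for a in self.actions)
        key = (state, action)
        # Fast update to short-term memory only (plasticity).
        self.stm[key] += self.alpha * (target - self.stm[key])
        # Slow consolidation into long-term memory (stability).
        self.ltm[key] += self.consolidation_rate * (self.stm[key] - self.ltm[key])
```

When the learner advances to a new layer or subtask, the long-term table still carries the earlier behavior while the short-term table adapts, so the blended policy loses far less prior knowledge than a single Q-table overwritten in place.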