Mitigating Catastrophic Forgetting with Complementary Layered Learning

Cited by: 1
Authors
Mondesire, Sean [1 ]
Wiegand, R. Paul [2 ]
Affiliations
[1] Univ Cent Florida, Inst Simulat & Training, Orlando, FL 32826 USA
[2] Winthrop Univ, Dept Comp Sci & Quantitat Methods, Rock Hill, SC 29733 USA
Keywords
layered learning; transfer learning; catastrophic forgetting; multi-agent system; behaviors
DOI
10.3390/electronics12030706
Chinese Library Classification
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
Catastrophic forgetting is a stability-plasticity imbalance that causes a machine learner to lose previously gained knowledge critical for performing a task. The imbalance arises during transfer learning and degrades the learner's performance, particularly in neural networks and layered learning. This work proposes a complementary learning technique that introduces long- and short-term memory into layered learning to reduce the negative effects of catastrophic forgetting. In particular, the dual-memory system is applied to non-neural-network instances of layered learning based on evolutionary computation and Q-learning, because these techniques are used to develop decision-making capabilities for physical robots. Experiments evaluate the new learning augmentation in a multi-agent system simulation in which autonomous unmanned aerial vehicles learn to collaborate and maneuver to survey an area effectively. Through these direct-policy and value-based learning experiments, the proposed complementary layered learning is shown to significantly improve task performance over standard layered learning, successfully balancing stability and plasticity.
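The record does not reproduce the authors' implementation, so the following is only a minimal, hypothetical Python sketch of how a long-/short-term dual memory could be grafted onto a tabular Q-learning learner used inside a layered-learning pipeline. The class name DualMemoryQLearner, the blending weight long_term_weight, and the consolidate() rule are illustrative assumptions for this sketch, not the paper's method.

# Illustrative sketch (not the authors' implementation): a tabular Q-learning
# agent with complementary long- and short-term memories. The short-term
# table adapts quickly to the current layer's task; the long-term table is
# consolidated slowly and blended into action selection so behavior learned
# in earlier layers is not simply overwritten. Parameter names and the
# blending/consolidation rules are assumptions made for this example.
import random
from collections import defaultdict

class DualMemoryQLearner:
    def __init__(self, actions, alpha=0.5, gamma=0.9, epsilon=0.1,
                 consolidation_rate=0.05, long_term_weight=0.5):
        self.actions = actions
        self.alpha = alpha                              # short-term learning rate
        self.gamma = gamma                              # discount factor
        self.epsilon = epsilon                          # exploration rate
        self.consolidation_rate = consolidation_rate    # slow transfer into long-term memory
        self.long_term_weight = long_term_weight        # weight of long-term values at decision time
        self.q_short = defaultdict(float)               # fast, plastic memory (current layer)
        self.q_long = defaultdict(float)                # slow, stable memory (earlier layers)

    def value(self, state, action):
        # Blend the stable and plastic estimates when acting.
        w = self.long_term_weight
        return w * self.q_long[(state, action)] + (1 - w) * self.q_short[(state, action)]

    def act(self, state):
        # Epsilon-greedy action selection over the blended value estimate.
        if random.random() < self.epsilon:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.value(state, a))

    def update(self, state, action, reward, next_state):
        # Standard Q-learning update, applied only to the short-term memory.
        best_next = max(self.value(next_state, a) for a in self.actions)
        target = reward + self.gamma * best_next
        td_error = target - self.q_short[(state, action)]
        self.q_short[(state, action)] += self.alpha * td_error

    def consolidate(self):
        # Move a fraction of short-term knowledge into the long-term memory
        # instead of overwriting it, preserving earlier-layer skills.
        for key, v_short in self.q_short.items():
            self.q_long[key] += self.consolidation_rate * (v_short - self.q_long[key])

    def start_new_layer(self):
        # When a new layered-learning subtask begins, consolidate and reset
        # the plastic memory so the new layer starts fresh while the stable
        # memory retains previously learned behavior.
        self.consolidate()
        self.q_short.clear()

In this sketch, calling start_new_layer() between subtasks is what distinguishes the dual-memory learner from standard layered learning, where the single Q-table would be retrained on the new subtask and could lose earlier-layer behavior.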
Pages: 17