Continual and One-Shot Learning Through Neural Networks with Dynamic External Memory

Cited by: 13
Authors
Lüders, Benno [1]
Schläger, Mikkel [1]
Korach, Aleksandra [1]
Risi, Sebastian [1]
Affiliation
[1] IT Univ Copenhagen, Copenhagen, Denmark
Keywords
Neural Turing Machine; Continual learning; Adaptive neural networks; Plasticity; Memory; Neuroevolution
DOI
10.1007/978-3-319-55849-3_57
CLC Number
TP301 [Theory and Methods]
Discipline Code
081202
Abstract
Training neural networks to quickly learn new skills without forgetting previously learned skills is an important open challenge in machine learning. A common problem for adaptive networks that can learn during their lifetime is that the weights encoding a particular task are often overridden when a new task is learned. This paper takes a step toward overcoming this limitation by building on the recently proposed Evolving Neural Turing Machine (ENTM) approach. In the ENTM, neural networks are augmented with an external memory component that they can write to and read from, which allows them to store associations quickly and over long periods of time. The results in this paper demonstrate that the ENTM is able to perform one-shot learning in reinforcement learning tasks without catastrophic forgetting of previously stored associations. Additionally, we introduce a new ENTM default jump mechanism that makes it easier to find unused memory locations and therefore facilitates the evolution of continual learning networks. Our results suggest that augmenting evolving networks with an external memory component is not only a viable mechanism for adaptive behaviors in neuroevolution but also allows these networks to perform continual and one-shot learning at the same time.
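The memory mechanism summarized in the abstract (an external read/write memory plus a "default jump" to unused locations) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the class name, the single-head interface, the slot bookkeeping, and the use of Euclidean distance for content addressing are all our assumptions.

```python
import math

class ExternalMemory:
    """Minimal sketch of an ENTM-style external memory with one
    read/write head (illustrative assumptions, see note above)."""

    def __init__(self, num_slots=8, width=4):
        self.mem = [[0.0] * width for _ in range(num_slots)]
        self.used = [False] * num_slots   # which slots have been written
        self.head = 0                     # current head position

    def write(self, vector):
        self.mem[self.head] = [float(x) for x in vector]
        self.used[self.head] = True

    def read(self):
        return list(self.mem[self.head])

    def content_jump(self, key):
        # Content-based addressing: move the head to the stored vector
        # closest (by Euclidean distance) to the query key.
        self.head = min(range(len(self.mem)),
                        key=lambda i: math.dist(self.mem[i], key))

    def default_jump(self):
        # The "default jump" idea: move the head to an unused slot, so a
        # new association never overwrites a previously stored one.
        for i, used in enumerate(self.used):
            if not used:
                self.head = i
                return

# Store two associations, then recall the first by content.
mem = ExternalMemory()
mem.write([1, 0, 0, 0])        # first association at slot 0
mem.default_jump()             # head jumps to an unused slot
mem.write([0, 1, 0, 0])        # second association; slot 0 intact
mem.content_jump([1, 0, 0, 0])
print(mem.read())              # -> [1.0, 0.0, 0.0, 0.0]
```

Without the default jump, the head would stay on slot 0 and the second write would overwrite the first association, which is exactly the kind of catastrophic forgetting the mechanism is meant to avoid.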
Pages: 886-901 (16 pages)