Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation

Cited: 4
Authors
Golden, Ryan [1 ,2 ]
Delanois, Jean Erik [2 ,3 ]
Sanda, Pavel [4 ]
Bazhenov, Maxim [1 ,2 ]
Affiliations
[1] Univ Calif San Diego, Neurosci Grad Program, La Jolla, CA 92093 USA
[2] Univ Calif San Diego, Dept Med, La Jolla, CA 92093 USA
[3] Univ Calif San Diego, Dept Comp Sci & Engn, La Jolla, CA 92093 USA
[4] Czech Acad Sci, Inst Comp Sci, Prague, Czech Republic
Keywords
MEMORY CONSOLIDATION; CONNECTIONIST MODELS; FEEDBACK-CIRCUITS; PREFRONTAL CORTEX; REPLAY; REACTIVATION; FEEDFORWARD; PREDICTION; OSCILLATIONS; WAKEFULNESS;
DOI
10.1371/journal.pcbi.1010628
CLC classification
Q5 [Biochemistry];
Subject classification codes
071010; 081704;
Abstract
Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomenon known as catastrophic forgetting. In contrast, the brain learns continuously, and typically learns best when new training is interleaved with periods of sleep for memory consolidation. Here we used a spiking network to study the mechanisms behind catastrophic forgetting and the role of sleep in preventing it. The network could be trained to learn a complex foraging task but exhibited catastrophic forgetting when trained sequentially on different tasks. In synaptic weight space, new task training moved the synaptic weight configuration away from the manifold representing the old task, leading to forgetting. Interleaving new task training with periods of off-line reactivation, mimicking biological sleep, mitigated catastrophic forgetting by constraining the network's synaptic weight state to the previously learned manifold, while allowing the weight configuration to converge towards the intersection of the manifolds representing the old and new tasks. The study reveals a possible strategy of synaptic weight dynamics that the brain applies during sleep to prevent forgetting and optimize learning.
Author summary
Artificial neural networks can achieve superhuman performance in many domains. Despite these advances, these networks fail at sequential learning; they achieve optimal performance on newer tasks at the expense of performance on previously learned tasks. Humans and animals, on the other hand, have a remarkable ability to learn continuously and to incorporate new data into their corpus of existing knowledge. Sleep has been hypothesized to play an important role in memory and learning by enabling spontaneous reactivation of previously learned memory patterns. Here we use a spiking neural network model, simulating sensory processing and reinforcement learning in the animal brain, to demonstrate that interleaving new task training with sleep-like activity optimizes the network's memory representation in synaptic weight space to prevent forgetting of old memories. Sleep makes this possible by replaying old memory traces without explicit use of the old task data.
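The core idea above is a training schedule rather than a new loss function: phases of supervised training on the new task alternate with sleep-like phases in which the network is driven only by spontaneous activity and updated with a local, unsupervised rule, so old-task data are never revisited. The sketch below (not the authors' code) illustrates only that schedule: it uses a toy linear network with a delta rule for the wake phases and a simple Hebbian update on noise-driven activity as a crude placeholder for the paper's replay-driven spiking plasticity. The network size, random tasks, learning rates, and update rules are all illustrative assumptions, not the published spiking model.

```python
# Conceptual sketch only: interleaving new-task training with "sleep" phases.
# The linear network, random tasks, and Hebbian sleep rule are illustrative
# placeholders, not the spiking model or plasticity rules used in the paper.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 20, 5
W = rng.normal(scale=0.1, size=(n_out, n_in))   # single weight matrix

def make_task(seed):
    """Random linear mapping standing in for one task."""
    r = np.random.default_rng(seed)
    T = r.normal(size=(n_out, n_in))
    X = r.normal(size=(200, n_in))
    return X, X @ T.T

def wake_step(W, X, Y, lr=0.01):
    """Supervised (delta-rule) update on the current task's data."""
    err = X @ W.T - Y
    return W - lr * err.T @ X / len(X)

def sleep_step(W, lr=1e-4, n_samples=200):
    """Sleep-like phase: the network is driven by spontaneous (noise) input,
    its own responses are treated as reactivated activity, and a Hebbian
    update reinforces the currently strong pathways. No old-task data are
    used anywhere in this phase."""
    X = rng.normal(size=(n_samples, n_in))   # spontaneous input
    Y = X @ W.T                              # network's own reactivated output
    return W + lr * Y.T @ X / n_samples      # Hebbian reinforcement

def mse(W, X, Y):
    return float(np.mean((X @ W.T - Y) ** 2))

X_old, Y_old = make_task(1)   # previously learned task
X_new, Y_new = make_task(2)   # new task trained afterwards

for _ in range(2000):                       # initial training on the old task
    W = wake_step(W, X_old, Y_old)

for epoch in range(2000):                   # sequential training on the new
    W = wake_step(W, X_new, Y_new)          # task, interleaved with sleep
    if epoch % 10 == 0:
        W = sleep_step(W)

print("old-task error:", mse(W, X_old, Y_old),
      "new-task error:", mse(W, X_new, Y_new))
```

The property this sketch preserves from the paper is structural: the sleep phase consumes only the network's own activity and current weights, never stored samples from the old task.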
Pages: 31