Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation

Cited: 4
Authors
Golden, Ryan [1 ,2 ]
Delanois, Jean Erik [2 ,3 ]
Sanda, Pavel [4 ]
Bazhenov, Maxim [1 ,2 ]
Affiliations
[1] Univ Calif San Diego, Neurosci Grad Program, La Jolla, CA 92093 USA
[2] Univ Calif San Diego, Dept Med, La Jolla, CA 92093 USA
[3] Univ Calif San Diego, Dept Comp Sci & Engn, La Jolla, CA 92093 USA
[4] Czech Acad Sci, Inst Comp Sci, Prague, Czech Republic
Keywords
MEMORY CONSOLIDATION; CONNECTIONIST MODELS; FEEDBACK-CIRCUITS; PREFRONTAL CORTEX; REPLAY; REACTIVATION; FEEDFORWARD; PREDICTION; OSCILLATIONS; WAKEFULNESS;
DOI
10.1371/journal.pcbi.1010628
Chinese Library Classification
Q5 [Biochemistry];
Subject Classification Codes
071010; 081704;
Abstract
Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomenon known as catastrophic forgetting. In contrast, the brain learns continuously, and typically learns best when new training is interleaved with periods of sleep for memory consolidation. Here we used a spiking network to study the mechanisms behind catastrophic forgetting and the role of sleep in preventing it. The network could be trained to learn a complex foraging task but exhibited catastrophic forgetting when trained sequentially on different tasks. In synaptic weight space, new task training moved the synaptic weight configuration away from the manifold representing the old task, leading to forgetting. Interleaving new task training with periods of off-line reactivation, mimicking biological sleep, mitigated catastrophic forgetting by constraining the network's synaptic weight state to the previously learned manifold while allowing the weight configuration to converge towards the intersection of the manifolds representing the old and new tasks. The study reveals a possible strategy of synaptic weight dynamics that the brain applies during sleep to prevent forgetting and optimize learning.

Author summary: Artificial neural networks can achieve superhuman performance in many domains. Despite these advances, they fail at sequential learning: they achieve optimal performance on newer tasks at the expense of performance on previously learned tasks. Humans and animals, on the other hand, have a remarkable ability to learn continuously and incorporate new data into their corpus of existing knowledge. Sleep has been hypothesized to play an important role in memory and learning by enabling spontaneous reactivation of previously learned memory patterns. Here we use a spiking neural network model, simulating sensory processing and reinforcement learning in the animal brain, to demonstrate that interleaving new task training with sleep-like activity optimizes the network's memory representation in synaptic weight space to prevent forgetting of old memories. Sleep makes this possible by replaying old memory traces without explicit use of the old task data.
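The training schedule the abstract describes lends itself to a brief illustration. Below is a minimal Python sketch, not the authors' model: a toy threshold-unit layer with reward-gated and unsupervised Hebbian updates stands in for the paper's spiking network and rewarded STDP, and all layer sizes, tasks, and the sleep_phase routine are hypothetical simplifications. It shows the schedule in outline: learn Task 1, then interleave Task 2 training with sleep-like phases in which noise-driven reactivation reinforces previously learned weights without access to Task 1 data.

```python
# Toy sketch of interleaving new-task training with sleep-like replay.
# Assumption: simple Hebbian rules stand in for the paper's rewarded/unsupervised STDP.
import numpy as np

rng = np.random.default_rng(0)
N_IN, N_OUT = 64, 8                                  # hypothetical layer sizes
W = rng.uniform(0.0, 0.2, size=(N_OUT, N_IN))        # feedforward weights

def forward(x, thresh=1.0):
    """Binary 'spikes' of output units whose summed drive crosses threshold."""
    return (W @ x >= thresh).astype(float)

def train_task(patterns, targets, epochs=100, lr=0.02):
    """Reward-gated Hebbian update (sketch of rewarded STDP): strengthen
    synapses onto the rewarded output unit, weaken those onto a wrong winner."""
    global W
    for _ in range(epochs):
        for x, t in zip(patterns, targets):
            winner = int(np.argmax(W @ x))
            W[t] += lr * x                           # potentiate rewarded mapping
            if winner != t:
                W[winner] -= lr * x                  # depress the erroneous winner
            np.clip(W, 0.0, 1.0, out=W)

def sleep_phase(steps=2000, lr=0.002, noise_rate=0.15):
    """Sleep-like replay: unstructured input noise drives spontaneous activity;
    unsupervised Hebbian updates reinforce whichever traces the existing
    weights replay, with no access to old task data."""
    global W
    for _ in range(steps):
        x = (rng.random(N_IN) < noise_rate).astype(float)   # random input spikes
        out = forward(x, thresh=1.5)                         # replay events
        W += lr * (np.outer(out, x) - 0.1 * out[:, None])    # potentiate co-active pairs
        np.clip(W, 0.0, 1.0, out=W)

def accuracy(patterns, targets):
    return float(np.mean([np.argmax(W @ x) == t for x, t in zip(patterns, targets)]))

def make_task(n=10):
    """Hypothetical task: random input patterns mapped to random output units."""
    return ([(rng.random(N_IN) < 0.3).astype(float) for _ in range(n)],
            [int(rng.integers(N_OUT)) for _ in range(n)])

p1, t1 = make_task()
p2, t2 = make_task()

train_task(p1, t1)                                   # sequential Task 1 training
for _ in range(5):                                   # Task 2 interleaved with sleep
    train_task(p2, t2, epochs=20)
    sleep_phase()
print("Task 1 accuracy:", accuracy(p1, t1), "Task 2 accuracy:", accuracy(p2, t2))
```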
Pages: 31
Related Papers (50 records in total)
  • [31] Joint A-SNN: Joint training of artificial and spiking neural networks via self-Distillation and weight factorization
    Guo, Yufei
    Peng, Weihang
    Chen, Yuanpei
    Zhang, Liwen
    Liu, Xiaode
    Huang, Xuhui
    Ma, Zhe
    PATTERN RECOGNITION, 2023, 142
  • [32] Spiking neural networks for deep learning and knowledge representation: Editorial
    Kasabov, Nikola K.
    NEURAL NETWORKS, 2019, 119 : 341 - 342
  • [33] Biologically Plausible Learning of Text Representation with Spiking Neural Networks
    Bialas, Marcin
    Mironczuk, Marcin Michal
    Mandziuk, Jacek
    PARALLEL PROBLEM SOLVING FROM NATURE - PPSN XVI, PT I, 2020, 12269 : 433 - 447
  • [34] Synaptic Energy Drives the Information Processing Mechanisms in Spiking Neural Networks
    El Laithy, Karim
    Bogdan, Martin
    MATHEMATICAL BIOSCIENCES AND ENGINEERING, 2014, 11 (02) : 233 - 256
  • [35] Capacitor-Based Synaptic Devices for Hardware Spiking Neural Networks
    Hwang, Sungmin
    Yu, Junsu
    Lee, Geun Ho
    Song, Min Suk
    Chang, Jeesoo
    Min, Kyung Kyu
    Jang, Taejin
    Lee, Jong-Ho
    Park, Byung-Gook
    Kim, Hyungjin
    IEEE ELECTRON DEVICE LETTERS, 2022, 43 (04) : 549 - 552
  • [36] Neurons With Captive Synaptic Devices for Temperature Robust Spiking Neural Networks
    Park, Kyungchul
    Kim, Sungjoon
    Baek, Myung-Hyun
    Jeon, Bosung
    Kim, Yeon-Woo
    Choi, Woo Young
    IEEE ELECTRON DEVICE LETTERS, 2024, 45 (03) : 492 - 495
  • [37] ASP: Learning to Forget With Adaptive Synaptic Plasticity in Spiking Neural Networks
    Panda, Priyadarshini
    Allred, Jason M.
    Ramanathan, Shriram
    Roy, Kaushik
    IEEE JOURNAL ON EMERGING AND SELECTED TOPICS IN CIRCUITS AND SYSTEMS, 2018, 8 (01) : 51 - 64
  • [38] Supervised Learning in Spiking Neural Networks with Synaptic Delay Plasticity: An Overview
    Lan, Yawen
    Li, Qiang
    CURRENT BIOINFORMATICS, 2020, 15 (08) : 854 - 865
  • [39] Optoelectronic Memristor Model for Optical Synaptic Circuit of Spiking Neural Networks
    Xu, Jiawei
    Zheng, Yi
    Sheng, Chenxu
    Cai, Yichen
    Stathis, Dimitrios
    Shen, Ruisi
    Zheng, Li-Rong
    Zou, Zhuo
    Hu, Laigui
    Hemani, Ahmed
    2023 21ST IEEE INTERREGIONAL NEWCAS CONFERENCE, NEWCAS, 2023,
  • [40] Diffusion-based neuromodulation can eliminate catastrophic forgetting in simple neural networks
    Velez, Roby
    Clune, Jeff
    PLOS ONE, 2017, 12 (11):