Spike-Train Level Backpropagation for Training Deep Recurrent Spiking Neural Networks

Cited by: 0
Authors
Zhang, Wenrui [1 ]
Li, Peng [1 ]
Affiliations
[1] Univ Calif Santa Barbara, Santa Barbara, CA 93106 USA
Funding
U.S. National Science Foundation;
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Spiking neural networks (SNNs) are well suited to spatio-temporal learning and to energy-efficient, event-driven neuromorphic hardware. As an important class of SNNs, recurrent spiking neural networks (RSNNs) possess great computational power. However, the practical application of RSNNs is severely limited by the difficulty of training them. Biologically inspired unsupervised learning has limited capability to boost the performance of RSNNs. On the other hand, existing backpropagation (BP) methods, when applied to RSNNs, suffer from the high complexity of unfolding the network in time, from vanishing and exploding gradients, and from the need to approximate the derivatives of discontinuous spiking activities. To enable supervised training of RSNNs under a well-defined loss function, we present a novel Spike-Train level RSNN Backpropagation (ST-RSBP) algorithm for training deep RSNNs. ST-RSBP directly computes the gradient of a rate-coded loss function, defined at the output layer of the network, with respect to the tunable parameters. Its scalability comes from the proposed spike-train level computation, during which the temporal effects of the SNN are captured in both the forward and backward passes of BP. ST-RSBP can be applied broadly, to RSNNs with a single recurrent layer as well as to deep RSNNs with multiple feedforward and recurrent layers. On challenging speech and image datasets, including TI46 [25], N-TIDIGITS [3], Fashion-MNIST [40], and MNIST, ST-RSBP trains SNNs to accuracies surpassing those of current state-of-the-art SNN BP algorithms and of conventional non-spiking deep learning models.
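To make the central idea concrete, below is a minimal, hypothetical NumPy sketch of a rate-coded output loss and one spike-train-level gradient step. It is not the paper's ST-RSBP derivation: the smooth rate map standing in for the true spiking dynamics, and all names such as `pre_spikes` and `target_counts`, are illustrative assumptions. What it demonstrates is the structural point from the abstract: the loss and the gradient are defined over aggregate spike-train statistics, so no unrolling of the network over individual time steps is required.

```python
import numpy as np

# Minimal, illustrative sketch of a rate-coded loss and a spike-train-level
# gradient. NOT the ST-RSBP algorithm from the paper; the ReLU-like rate map
# below is a hypothetical stand-in for the discontinuous spiking dynamics.

rng = np.random.default_rng(0)

T = 100                      # simulation steps per example
n_in, n_out = 30, 10         # presynaptic / output layer sizes
W = rng.normal(0.0, 0.1, size=(n_out, n_in))

# Recorded presynaptic spike trains (0/1 per step) and target spike counts.
pre_spikes = (rng.random((n_in, T)) < 0.05).astype(float)
target_counts = np.zeros(n_out)
target_counts[3] = 20.0      # e.g. the label neuron should fire ~20 times

# Spike-train level statistic: total count per presynaptic neuron.
pre_counts = pre_spikes.sum(axis=1)

# Smooth surrogate: output count as a function of weighted input counts.
drive = W @ pre_counts
out_counts = np.maximum(drive, 0.0)

# Rate-coded loss defined at the output layer, as in the abstract.
loss = 0.5 * np.sum((out_counts - target_counts) ** 2)

# One chain-rule step over aggregate counts: no unfolding over the T steps.
dL_dcounts = out_counts - target_counts
dcounts_ddrive = (drive > 0).astype(float)   # surrogate derivative
grad_W = np.outer(dL_dcounts * dcounts_ddrive, pre_counts)

W -= 1e-4 * grad_W                           # gradient-descent update
print(f"loss = {loss:.3f}")
```

A real implementation would replace the smooth rate map with spike-train level post-synaptic responses recorded from the simulated network; the sketch only preserves the idea that gradients flow through per-example spike statistics rather than through every simulation time step.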
Pages: 12
Related Papers
50 in total
  • [1] Spike-Train Level Direct Feedback Alignment: Sidestepping Backpropagation for On-Chip Training of Spiking Neural Nets
    Lee, Jeongjun
    Zhang, Renqian
    Zhang, Wenrui
    Liu, Yu
    Li, Peng
    [J]. FRONTIERS IN NEUROSCIENCE, 2020, 14
  • [2] Spike-Train Level Unsupervised Learning Algorithm for Deep Spiking Belief Networks
    Lin, Xianghong
    Du, Pangao
    [J]. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2020, PT II, 2020, 12397: 634-645
  • [3] Training Deep Spiking Neural Networks Using Backpropagation
    Lee, Jun Haeng
    Delbruck, Tobi
    Pfeiffer, Michael
    [J]. FRONTIERS IN NEUROSCIENCE, 2016, 10
  • [4] Hybrid Macro/Micro Level Backpropagation for Training Deep Spiking Neural Networks
    Jin, Yingyezhe
    Zhang, Wenrui
    Li, Peng
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [5] STCSNN: High energy efficiency spike-train level spiking neural networks with spatio-temporal conversion
    Xu, Changqing
    Liu, Yi
    Yang, Yintang
    [J]. NEUROCOMPUTING, 2024, 607
  • [6] A Simple Digital Spiking Neural Network: Synchronization and Spike-Train Approximation
    Uchida, Hiroaki
    Oishi, Yuya
    Saito, Toshimichi
    [J]. DISCRETE AND CONTINUOUS DYNAMICAL SYSTEMS-SERIES S, 2021, 14 (04): 1479-1494
  • [7] Temporal Spike Sequence Learning via Backpropagation for Deep Spiking Neural Networks
    Zhang, Wenrui
    Li, Peng
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [8] Basic spike-train properties of a digital spiking neuron
    Torikai, Hiroyuki
    [J]. DISCRETE AND CONTINUOUS DYNAMICAL SYSTEMS-SERIES B, 2008, 9 (01): 183-198
  • [9] Backpropagation with biologically plausible spatiotemporal adjustment for training deep spiking neural networks
    Shen, Guobin
    Zhao, Dongcheng
    Zeng, Yi
    [J]. PATTERNS, 2022, 3 (06)
  • [10] Temporal Backpropagation for Spiking Neural Networks with One Spike per Neuron
    Kheradpisheh, Saeed Reza
    Masquelier, Timothee
    [J]. INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2020, 30 (06)