Dynamic layer-span connecting spiking neural networks with backpropagation training

Cited by: 0
Authors
Zijjian Wang
Yuxuan Huang
Yaqin Zhu
Binxing Xu
Long Chen
Affiliations
[1] Donghua University, Department of Computer Science and Technology
[2] China Telecom Research Institute
Keywords: Spiking neural network; Surrogate gradient; Backpropagation; Brain-inspired computing; Brain-like computing
DOI: not available
Abstract
Spiking Neural Networks (SNNs) are one of the mainstream frameworks for brain-like and neuromorphic computing, with the potential to overcome current AI challenges such as learning dynamic processes at low power. However, a large performance gap remains between SNNs and artificial neural networks (ANNs) in traditional supervised learning. One way to close this gap is to design better spiking neuron models with stronger memory for temporal data. This paper proposes a leaky integrate-and-fire (LIF) neuron model with dynamic postsynaptic potential and a layer-span connecting method for SNNs trained with backpropagation. The dynamic postsynaptic potential LIF model lets neurons in an SNN release neurotransmitters dynamically, mimicking the activity of biological neurons. The layer-span connecting method enhances the long-distance memory ability of the SNN. We also introduce, for the first time, a cosh-based surrogate gradient for backpropagation training of SNNs. We compare the SNN with the cosh-based surrogate gradient (CSNN), the CSNN with dynamic postsynaptic potential (Dyn-CSNN), the layer-span connecting CSNN (Las-CSNN), and an SNN combining all the proposed methods (DlaCSNN-BP) on three image classification datasets and one text classification dataset. The experimental results show that the proposed SNN methods outperform most previously proposed SNNs and ANNs with the same network structure, with DlaCSNN-BP achieving the best classification performance. These results indicate that our method effectively improves the performance of SNNs in supervised learning and narrows the gap with deep learning, opening more possibilities for putting SNNs into practical application.
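The abstract combines a LIF neuron with a cosh-based surrogate gradient for backpropagation training. As a rough illustration only — the record gives no equations, so the sech²-shaped surrogate, the soft-reset rule, and every parameter value (`beta`, `theta`, `a`) below are assumptions, not the paper's formulation — a discrete-time LIF step and a cosh-based surrogate might be sketched as:

```python
import math

def lif_step(v, current, beta=0.9, theta=1.0):
    """One discrete-time step of a leaky integrate-and-fire neuron.

    The membrane potential v decays by a factor beta, integrates the
    input current, and emits a spike (1.0) when it crosses the
    threshold theta; a "soft reset" then subtracts theta.
    """
    v = beta * v + current
    spike = 1.0 if v >= theta else 0.0
    v -= spike * theta
    return v, spike

def cosh_surrogate_grad(v, theta=1.0, a=2.0):
    """One plausible cosh-based surrogate gradient (an assumption, not
    the paper's formula): a sech^2 bump centred on the firing threshold,
    substituted for the non-differentiable spike during the backward pass.
    """
    return a / (2.0 * math.cosh(a * (v - theta)) ** 2)

# Drive the neuron with a constant input and record its spike train.
v, spikes = 0.0, []
for _ in range(20):
    v, s = lif_step(v, 0.3)
    spikes.append(s)
```

The hard threshold is kept in the forward pass; only the gradient of the spike with respect to the membrane potential is replaced by the smooth surrogate, which peaks at the threshold and decays on both sides.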
Source: Complex & Intelligent Systems, 2024, 10 (02)
Pages: 1937-1952
Number of pages: 15