Dynamic layer-span connecting spiking neural networks with backpropagation training

Cited by: 0
Authors:
Zijian Wang
Yuxuan Huang
Yaqin Zhu
Binxing Xu
Long Chen
Affiliations:
[1] Donghua University, Department of Computer Science and Technology
[2] China Telecom Research Institute
Keywords:
Spiking neural network; Surrogate gradient; Backpropagation; Brain-inspired computing; Brain-like computing
DOI: not available
Abstract
The Spiking Neural Network (SNN) is one of the mainstream frameworks for brain-like and neuromorphic computing, with the potential to overcome current AI challenges such as learning dynamic processes at low power. However, a large performance gap remains between SNNs and artificial neural networks (ANNs) in traditional supervised learning. One way to narrow this gap is to design better spiking neuron models with stronger memory for temporal data. This paper proposes a leaky integrate-and-fire (LIF) neuron model with dynamic postsynaptic potential and a layer-span connecting method for SNNs trained with backpropagation. The dynamic postsynaptic potential LIF model allows neurons in an SNN to dynamically release neurotransmitters, mimicking the activity of biological neurons, while the layer-span connecting method enhances the long-distance memory ability of the SNN. We also introduce, for the first time, a cosh-based surrogate gradient for the backpropagation training of SNNs. We compare an SNN with the cosh-based surrogate gradient (CSNN), a CSNN with dynamic postsynaptic potential (Dyn-CSNN), a layer-span connecting CSNN (Las-CSNN), and an SNN combining all the proposed methods (DlaCSNN-BP) on three image classification datasets and one text classification dataset. The experimental results show that the proposed SNN methods outperform most previously proposed SNNs and ANNs with the same network structure; among them, DlaCSNN-BP achieves the best classification performance. These results indicate that the proposed methods effectively improve SNN performance in supervised learning and reduce the gap with deep learning, providing more possibilities for putting SNNs into practical application.
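The abstract relies on surrogate-gradient training: the non-differentiable spike (Heaviside) function is used in the forward pass, while a smooth surrogate stands in for its derivative during backpropagation. The paper's exact cosh-based form is not reproduced here, so the sketch below assumes a plausible sech²-style surrogate; `alpha`, `v_th`, and the leak factor `beta` are illustrative parameters, not values from the paper:

```python
import numpy as np

def spike(v, v_th=1.0):
    """Forward pass: non-differentiable Heaviside spike function."""
    return (v >= v_th).astype(np.float64)

def cosh_surrogate(v, v_th=1.0, alpha=2.0):
    """Backward pass: hypothetical cosh-based surrogate for the spike
    derivative, alpha / (2 * cosh^2(alpha * (v - v_th))).
    This is the derivative of 0.5 * (1 + tanh(alpha * (v - v_th))):
    bell-shaped and peaked at the firing threshold v_th."""
    return alpha / (2.0 * np.cosh(alpha * (v - v_th)) ** 2)

def lif_step(v, x, beta=0.9, v_th=1.0):
    """One leaky integrate-and-fire step: leak, integrate the input x,
    spike, then reset the membrane potential by subtraction."""
    v = beta * v + x          # leaky integration of input current
    s = spike(v, v_th)        # emit spikes where the threshold is crossed
    v = v - s * v_th          # soft reset where a spike occurred
    return v, s
```

During training, `spike` would run on the forward pass and `cosh_surrogate(v)` would replace d(spike)/dv when backpropagating through time; the surrogate is largest exactly at the threshold, where small changes in membrane potential flip the spike decision.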
Pages: 1937–1952 (15 pages)