Dynamic layer-span connecting spiking neural networks with backpropagation training

Cited by: 0
Authors
Zijjian Wang
Yuxuan Huang
Yaqin Zhu
Binxing Xu
Long Chen
Affiliations
[1] Donghua University, Department of Computer Science and Technology
[2] China Telecom Research Institute
Keywords
Spiking neural network; Surrogate gradient; Backpropagation; Brain-inspired computing; Brain-like computing
DOI: not available
Abstract
The Spiking Neural Network (SNN) is one of the mainstream frameworks for brain-like and neuromorphic computing, with the potential to overcome current AI challenges such as achieving low-power learning of dynamic processes. However, a large performance gap remains between SNNs and artificial neural networks (ANNs) in traditional supervised learning. One way to close this gap is to design better spiking neuron models that improve the network's memory of temporal data. This paper proposes a leaky integrate-and-fire (LIF) neuron model with a dynamic postsynaptic potential and a layer-span connecting method for SNNs trained with backpropagation. The dynamic postsynaptic potential LIF model allows neurons in an SNN to release neurotransmitters dynamically, mimicking the activity of biological neurons. The layer-span connecting method enhances the long-distance memory ability of the SNN. We are also the first to introduce a cosh-based surrogate gradient for backpropagation training of SNNs. We compared an SNN with the cosh-based surrogate gradient (CSNN), a CSNN with dynamic postsynaptic potential (Dyn-CSNN), a layer-span connecting CSNN (Las-CSNN), and an SNN combining all the proposed methods (DlaCSNN-BP) on three image classification datasets and one text classification dataset. The experimental results show that the proposed SNN methods outperform most previously proposed SNNs and ANNs with the same network structure. Among them, the proposed DlaCSNN-BP achieved the best classification performance. These results indicate that our proposed methods can effectively improve SNN performance in supervised learning and narrow the gap with deep learning. This work also opens more possibilities for putting SNNs into practical application.
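The abstract does not give the exact form of the cosh-based surrogate gradient, so the following is only a minimal illustrative sketch of the general technique: the forward pass uses a non-differentiable Heaviside spike, while the backward pass substitutes a smooth cosh-based bump centered at the firing threshold. The specific surrogate g(u) = α / (2·cosh²(α(u − v_th))), the leak factor, and all hyperparameter values here are assumptions, not the paper's definitions.

```python
import math

ALPHA = 2.0   # surrogate sharpness (hypothetical hyperparameter)
V_TH = 1.0    # firing threshold (assumed value)
LEAK = 0.5    # membrane leak factor per time step (assumed value)

def lif_step(v, x):
    """One LIF update: leaky integration, Heaviside firing, hard reset."""
    v = v * LEAK + x                     # leaky integration of input current
    spike = 1.0 if v >= V_TH else 0.0    # non-differentiable spike function
    if spike:
        v = 0.0                          # hard reset after firing
    return v, spike

def surrogate_grad(v):
    """Cosh-based surrogate for d(spike)/d(v): a sech^2 bump at threshold.

    Used only in the backward pass, where the true gradient of the
    Heaviside step is zero almost everywhere.
    """
    return ALPHA / (2.0 * math.cosh(ALPHA * (v - V_TH)) ** 2)

# Tiny demo: drive one neuron with a constant input for six steps.
v, spikes = 0.0, []
for _ in range(6):
    v, s = lif_step(v, 0.6)
    spikes.append(s)
print(spikes)                  # spike train produced by the forward pass
print(surrogate_grad(V_TH))    # surrogate gradient peaks at the threshold
```

In a full backpropagation-trained SNN, `surrogate_grad` would replace the derivative of the spike function during the backward pass (e.g. in a custom autograd function), letting gradients flow through spiking layers while the forward dynamics stay binary.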
Pages: 1937–1952
Number of pages: 15