Hybrid Macro/Micro Level Backpropagation for Training Deep Spiking Neural Networks

Cited by: 0
Authors
Jin, Yingyezhe [1 ]
Zhang, Wenrui [1 ]
Li, Peng [1 ]
Affiliations
[1] Texas A&M University, College Station, TX 77843 USA
Funding
U.S. National Science Foundation (NSF);
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Spiking neural networks (SNNs) are positioned to enable spatio-temporal information processing and ultra-low-power event-driven neuromorphic hardware. However, SNNs have yet to reach the performance of conventional deep artificial neural networks (ANNs), a long-standing challenge due to the complex dynamics and non-differentiable spike events encountered in training. Existing SNN error backpropagation (BP) methods are limited by poor scalability, improper handling of spiking discontinuities, and/or a mismatch between the rate-coded loss function and the computed gradient. We present a hybrid macro/micro level backpropagation (HM2-BP) algorithm for training multi-layer SNNs. Temporal effects are precisely captured by the proposed spike-train level post-synaptic potential (S-PSP) at the microscopic level. Rate-coded errors are defined at the macroscopic level, then computed and back-propagated across both the macroscopic and microscopic levels. Unlike existing BP methods, HM2-BP directly computes the gradient of the rate-coded loss function w.r.t. the tunable parameters. We evaluate HM2-BP by training deep fully connected and convolutional SNNs on the static MNIST [14] and dynamic neuromorphic N-MNIST [26] datasets. HM2-BP achieves accuracies of 99.49% and 98.88% on MNIST and N-MNIST, respectively, outperforming the best reported results obtained with existing SNN BP algorithms. Furthermore, HM2-BP produces the highest SNN-based accuracies on the EMNIST [3] dataset, and achieves high recognition accuracy on the 16-speaker spoken English letters of the TI46 Corpus [16], a challenging spatio-temporal speech recognition benchmark for which no prior success based on SNNs had been reported. It also achieves competitive performance, surpassing conventional deep learning models, when processing asynchronous spiking streams.
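The abstract's core idea can be sketched in a few lines: at the microscopic level, the spike-train level post-synaptic potential (S-PSP) aggregates, over all postsynaptic firing times, the PSP contributed by a presynaptic spike train; at the macroscopic level, a rate-coded loss on firing counts is differentiated w.r.t. each weight through that aggregate. The sketch below is illustrative only and is not the paper's exact formulation: the dual-exponential kernel, the time constants, the unit threshold `theta`, and the linear firing-count approximation are all assumptions made for the example.

```python
import numpy as np

def psp_kernel(dt, tau_m=10.0, tau_s=2.5):
    """Dual-exponential PSP kernel (an assumed, common choice); dt in ms."""
    if dt <= 0.0:
        return 0.0
    return (np.exp(-dt / tau_m) - np.exp(-dt / tau_s)) * tau_m / (tau_m - tau_s)

def s_psp(pre_spikes, post_spikes):
    """Spike-train level post-synaptic potential e_{i|j}: the total PSP the
    presynaptic train j contributes at neuron i's postsynaptic firing times."""
    return sum(
        psp_kernel(t_post - t_pre)
        for t_post in post_spikes
        for t_pre in pre_spikes
        if t_pre < t_post
    )

# Macro level (illustrative): approximate the firing count as
# a_i ~ (sum_j w_ij * e_{i|j}) / theta, so for the rate-coded loss
# E = 0.5 * (a_i - y_i)^2 the gradient is dE/dw_ij = (a_i - y_i) * e_{i|j} / theta.
theta = 1.0                 # firing threshold (placeholder value)
pre = [1.0, 4.0, 9.0]       # presynaptic spike times (ms)
post = [5.0, 12.0]          # postsynaptic spike times (ms)
e_ij = s_psp(pre, post)     # microscopic quantity, > 0 here
a_i, y_i = 2.0, 3.0         # actual vs. desired firing counts
grad_w = (a_i - y_i) * e_ij / theta  # negative: the weight should grow
```

The hybrid structure is visible in the last line: the error term `(a_i - y_i)` lives at the rate-coded macroscopic level, while `e_ij` carries the precise spike-timing information from the microscopic level.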
Pages: 11