STCSNN: High energy efficiency spike-train level spiking neural networks with spatio-temporal conversion

Cited by: 0
Authors
Xu, Changqing [1 ,2 ]
Liu, Yi [2 ]
Yang, Yintang [2 ]
Affiliations
[1] Xidian Univ, Guangzhou Inst Technol, Xian 710071, Peoples R China
[2] Xidian Univ, Sch Microelect, Xian 710071, Peoples R China
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
High energy efficiency; Spike-train level; Spatio-temporal conversion; Spiking neural network;
DOI
10.1016/j.neucom.2024.128364
CLC Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Brain-inspired spiking neural networks (SNNs) have attracted widespread research interest due to their low power consumption, high biological plausibility, and strong spatio-temporal information processing capability. Although adopting a surrogate gradient (SG) makes otherwise non-differentiable SNNs trainable, achieving accuracy comparable to ANNs while preserving low-power operation remains difficult. In this paper, we propose STCSNN, an energy-efficient spike-train level spiking neural network with spatio-temporal conversion, which offers low computational cost and high accuracy. In STCSNN, spatio-temporal conversion blocks (STCBs) are proposed to preserve the low-power characteristics of SNNs while improving accuracy. However, STCSNN cannot adopt backpropagation directly because of the non-differentiable nature of spike trains; we therefore derive a suitable learning rule for STCSNNs by deducing the equivalent gradient of the STCB. We evaluate the proposed STCSNN on static and neuromorphic datasets, including Fashion-MNIST, CIFAR10, CIFAR100, TinyImageNet, and DVS-CIFAR10. The experimental results show that the proposed STCSNN achieves state-of-the-art accuracy on nearly all of these datasets while using fewer time steps and being highly energy-efficient.
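The surrogate-gradient idea mentioned in the abstract can be illustrated with a minimal sketch: the forward pass keeps the non-differentiable Heaviside spike, while backpropagation substitutes a smooth approximate derivative. The rectangular-window surrogate and all function names below are illustrative assumptions, not the paper's actual STCB formulation.

```python
def spike_forward(membrane_potential, threshold=1.0):
    """Heaviside step: emit a spike (1.0) when the membrane crosses threshold.

    This is the non-differentiable operation that blocks standard backprop."""
    return 1.0 if membrane_potential >= threshold else 0.0

def spike_surrogate_grad(membrane_potential, threshold=1.0, width=0.5):
    """Rectangular surrogate derivative used in place of the Heaviside's
    true gradient (zero almost everywhere) during the backward pass."""
    if abs(membrane_potential - threshold) < width:
        return 1.0 / (2 * width)
    return 0.0
```

In practice the surrogate is only used when propagating gradients; the forward spike dynamics are unchanged, which is what lets SG-trained networks retain the sparse, event-driven behavior that makes SNNs energy-efficient.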
Pages: 10
Related Papers
50 records in total
  • [1] Spike-Train Level Backpropagation for Training Deep Recurrent Spiking Neural Networks
    Zhang, Wenrui
    Li, Peng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [2] Spike-Train Level Unsupervised Learning Algorithm for Deep Spiking Belief Networks
    Lin, Xianghong
    Du, Pangao
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2020, PT II, 2020, 12397 : 634 - 645
  • [3] Spatio-temporal Representations of Uncertainty in Spiking Neural Networks
    Savin, Cristina
    Deneve, Sophie
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014), 2014, 27
  • [4] Training spiking neural networks to associate spatio-temporal input-output spike patterns
    Mohemmed, Ammar
    Schliebs, Stefan
    Matsuda, Satoshi
    Kasabov, Nikola
    NEUROCOMPUTING, 2013, 107 : 3 - 10
  • [5] Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks
    Wu, Yujie
    Deng, Lei
    Li, Guoqi
    Zhu, Jun
    Shi, Luping
    FRONTIERS IN NEUROSCIENCE, 2018, 12
  • [6] Convolutional Spiking Neural Networks for Spatio-Temporal Feature Extraction
    Samadzadeh, Ali
    Far, Fatemeh Sadat Tabatabaei
    Javadi, Ali
    Nickabadi, Ahmad
    Chehreghani, Morteza Haghir
    NEURAL PROCESSING LETTERS, 2023, 55 (06) : 6979 - 6995
  • [7] Spike-Train Level Direct Feedback Alignment: Sidestepping Backpropagation for On-Chip Training of Spiking Neural Nets
    Lee, Jeongjun
    Zhang, Renqian
    Zhang, Wenrui
    Liu, Yu
    Li, Peng
    FRONTIERS IN NEUROSCIENCE, 2020, 14
  • [8] Acquisition and Representation of Spatio-Temporal Signals in Polychronizing Spiking Neural Networks
    Wang, Felix
    Severa, William M.
    Rothganger, Fred
    PROCEEDINGS OF THE 2019 7TH ANNUAL NEURO-INSPIRED COMPUTATIONAL ELEMENTS WORKSHOP (NICE 2019), 2020
  • [9] Efficient human activity recognition with spatio-temporal spiking neural networks
    Li, Yuhang
    Yin, Ruokai
    Kim, Youngeun
    Panda, Priyadarshini
    FRONTIERS IN NEUROSCIENCE, 2023, 17