Towards Ultra Low Latency Spiking Neural Networks for Vision and Sequential Tasks Using Temporal Pruning

Cited by: 17
Authors
Chowdhury, Sayeed Shafayet [1 ]
Rathi, Nitin [1 ]
Roy, Kaushik [1 ]
Affiliations
[1] Purdue University, West Lafayette, IN 47907, USA
Funding
U.S. National Science Foundation
Keywords
Spiking neural networks; Unit timestep; Energy efficiency; Temporal pruning; Reinforcement learning
DOI
10.1007/978-3-031-20083-0_42
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Spiking Neural Networks (SNNs) can be energy-efficient alternatives to commonly used deep neural networks (DNNs). However, computation over multiple timesteps increases latency and energy consumption and incurs the memory-access overhead of storing membrane potentials. Latency reduction is therefore pivotal for obtaining SNNs with high energy efficiency, yet reducing latency can adversely affect accuracy. To optimize the accuracy-energy-latency trade-off, we propose a temporal pruning method that starts with an SNN of T timesteps and reduces T at every training iteration, with threshold and leak as trainable parameters. This yields a continuum of SNNs from T timesteps all the way down to a unit timestep. Training SNNs directly with 1 timestep fails to converge due to layerwise spike vanishing and the difficulty of finding optimum thresholds. The proposed temporal pruning overcomes this by maintaining sufficient spiking activity, which enables suitable layerwise thresholds to be learned with backpropagation. Using the proposed algorithm, we achieve top-1 accuracies of 93.05%, 70.15%, and 69.00% on CIFAR-10, CIFAR-100, and ImageNet, respectively, with VGG16, in just 1 timestep. Note that SNNs with leaky-integrate-and-fire (LIF) neurons behave as recurrent neural networks (RNNs), with the membrane potential retaining information from previous inputs. The proposed SNNs thus also enable sequential tasks such as reinforcement learning on the CartPole and Atari Pong environments using only 1 to 5 timesteps.
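The two mechanisms the abstract leans on can be sketched in a few lines: an LIF neuron whose membrane potential carries state across timesteps (the recurrence the abstract notes), and a schedule that shrinks the timestep count T toward 1 across training rounds. This is a minimal illustrative sketch, not the authors' implementation: the `lif_forward` and `temporal_pruning_schedule` names, the soft-reset rule, and the fixed threshold/leak values are all assumptions here; in the paper, threshold and leak are trained with backpropagation.

```python
def lif_forward(inputs, threshold=1.0, leak=0.9):
    """Run one LIF neuron over a list of input currents (one per timestep).

    The membrane potential persists across timesteps, which is why an SNN
    behaves like a recurrent network. Threshold/leak are fixed here; in the
    paper they are trainable parameters.
    """
    mem, spikes = 0.0, []
    for x in inputs:
        mem = leak * mem + x      # leaky integration of input current
        if mem >= threshold:      # fire when the potential crosses threshold
            spikes.append(1)
            mem -= threshold      # soft reset: subtract the threshold
        else:
            spikes.append(0)
    return spikes

def temporal_pruning_schedule(T_start):
    """Timestep count per training round: T, T-1, ..., down to 1."""
    return list(range(T_start, 0, -1))

# Each round, the network would be (re)trained at a shorter latency;
# here we only show the shrinking input window.
inputs = [0.6] * 5
for T in temporal_pruning_schedule(5):
    print(T, lif_forward(inputs[:T]))
```

Starting from a converged T-timestep SNN keeps layerwise spiking activity high enough that the shorter-latency networks, including the final unit-timestep one, still receive useful gradients.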
Pages: 709-726 (18 pages)