FSpiNN: An Optimization Framework for Memory-Efficient and Energy-Efficient Spiking Neural Networks

Cited by: 31
Authors
Putra, Rachmad Vidya Wicaksana [1 ]
Shafique, Muhammad [1 ,2 ]
Affiliations
[1] Tech Univ Wien, Inst Comp Engn, A-1040 Vienna, Austria
[2] New York Univ Abu Dhabi, Div Engn, Abu Dhabi, U Arab Emirates
Keywords
Adaptivity; framework; edge devices; embedded systems; energy efficiency; memory; optimization; spiking neural networks (SNNs); spike-timing-dependent plasticity (STDP); unsupervised learning; NEURONS;
DOI
10.1109/TCAD.2020.3013049
CLC Number
TP3 [Computing Technology, Computer Technology];
Subject Classification Code
0812
Abstract
Spiking neural networks (SNNs) are gaining interest due to their event-driven processing, which enables low-power/energy computation on hardware platforms, and their unsupervised learning capability through the spike-timing-dependent plasticity (STDP) rule. However, state-of-the-art SNNs require a large memory footprint to achieve high accuracy, making them difficult to deploy on embedded systems such as battery-powered mobile devices and IoT edge nodes. Toward this, we propose FSpiNN, an optimization framework for obtaining memory-efficient and energy-efficient SNNs for training and inference, with unsupervised learning capability and without sacrificing accuracy. This is achieved by: 1) reducing the computational requirements of neuronal and STDP operations; 2) improving the accuracy of STDP-based learning; 3) compressing the SNN through fixed-point quantization; and 4) incorporating the memory and energy requirements into the optimization process. FSpiNN reduces the computational requirements by decreasing the number of neuronal operations, the STDP-based synaptic weight updates, and the STDP complexity. To improve learning accuracy, FSpiNN employs timestep-based synaptic weight updates and adaptively determines the STDP potentiation factor and the effective inhibition strength. Experimental results show that, compared to the state-of-the-art, FSpiNN achieves 7.5x memory savings and improves energy efficiency by 3.5x on average for training and by 1.8x on average for inference, across the MNIST and Fashion MNIST datasets, with no accuracy loss for a network with 4900 excitatory neurons, thereby enabling energy-efficient SNNs for edge devices/embedded systems.
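The abstract's three core ingredients, leaky integrate-and-fire (LIF) neuronal operations, STDP-based synaptic weight updates, and fixed-point weight quantization, can be sketched as follows. This is a minimal illustrative sketch only: the function names, parameter values, and the simplified pair-based STDP rule are assumptions for exposition, not the authors' actual FSpiNN implementation.

```python
import numpy as np

def lif_step(v, spikes_in, w, v_rest=-65.0, v_thresh=-52.0, tau=100.0, dt=1.0):
    """One timestep of a LIF neuron layer: the membrane potential leaks
    toward v_rest and integrates weighted presynaptic spikes; neurons
    crossing v_thresh emit a spike and reset to v_rest."""
    v = v + (dt / tau) * (v_rest - v) + spikes_in @ w
    fired = v >= v_thresh
    v[fired] = v_rest  # reset the neurons that spiked
    return v, fired

def stdp_update(w, pre_trace, post_spikes, lr=0.01, w_max=1.0):
    """Simplified pair-based STDP potentiation: synapses from recently
    active presynaptic neurons (tracked by pre_trace) onto neurons that
    just spiked are strengthened, clipped to [0, w_max]."""
    dw = lr * np.outer(pre_trace, post_spikes.astype(float))
    return np.clip(w + dw, 0.0, w_max)

def quantize_fixed_point(w, frac_bits=8):
    """Round weights to a fixed-point grid with `frac_bits` fractional
    bits, shrinking the memory footprint at a bounded precision loss."""
    scale = 2.0 ** frac_bits
    return np.round(w * scale) / scale
```

Fewer fractional bits in `quantize_fixed_point` trade accuracy for memory, which is the kind of memory/energy-aware trade-off the framework folds into its optimization process.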
Pages: 3601-3613
Page count: 13
Related Papers
50 records
  • [1] Memory-Efficient Reversible Spiking Neural Networks
    Zhang, Hong
    Zhang, Yu
    [J]. THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 15, 2024, : 16759 - 16767
  • [2] BitSNNs: Revisiting Energy-efficient Spiking Neural Networks
    Hu Y.
    Zheng Q.
    Pan G.
    [J]. IEEE Transactions on Cognitive and Developmental Systems, 2024, 16 (05) : 1 - 12
  • [3] AutoSNN: Towards Energy-Efficient Spiking Neural Networks
    Na, Byunggook
    Mok, Jisoo
    Park, Seongsik
    Lee, Dongjin
    Choe, Hyeokjun
    Yoon, Sungroh
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [4] Dynamic Spike Bundling for Energy-Efficient Spiking Neural Networks
    Krithivasan, Sarada
    Sen, Sanchari
    Venkataramani, Swagath
    Raghunathan, Anand
    [J]. 2019 IEEE/ACM INTERNATIONAL SYMPOSIUM ON LOW POWER ELECTRONICS AND DESIGN (ISLPED), 2019,
  • [5] Towards Energy-Efficient Sentiment Classification with Spiking Neural Networks
    Chen, Junhao
    Ye, Xiaojun
    Sun, Jingbo
    Li, Chao
    [J]. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PART X, 2023, 14263 : 518 - 529
  • [6] Sharing leaky-integrate-and-fire neurons for memory-efficient spiking neural networks
    Kim, Youngeun
    Li, Yuhang
    Moitra, Abhishek
    Yin, Ruokai
    Panda, Priyadarshini
    [J]. FRONTIERS IN NEUROSCIENCE, 2023, 17
  • [7] Conversion of Siamese networks to spiking neural networks for energy-efficient object tracking
    Luo, Yihao
    Shen, Haibo
    Cao, Xiang
    Wang, Tianjiang
    Feng, Qi
    Tan, Zehan
    [J]. NEURAL COMPUTING & APPLICATIONS, 2022, 34 (12): : 9967 - 9982
  • [9] TopSpark: A Timestep Optimization Methodology for Energy-Efficient Spiking Neural Networks on Autonomous Mobile Agents
    Putra, Rachmad Vidya Wicaksana
    Shafique, Muhammad
    [J]. 2023 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, IROS, 2023, : 3561 - 3567
  • [10] Spiking Deep Convolutional Neural Networks for Energy-Efficient Object Recognition
    Cao, Yongqiang
    Chen, Yang
    Khosla, Deepak
    [J]. International Journal of Computer Vision, 2015, 113 : 54 - 66