Efficient Hardware Implementation of Spiking Neural Networks Using Nonstandard Finite Difference Scheme for Leaky Integrate and Fire Neuron Model

Cited by: 0
|
Authors
Reddy, K. Venkateswara [1 ,2 ]
Balaji, N. [1 ]
Affiliations
[1] Jawaharlal Nehru Technol Univ, Dept Elect & Commun Engn, Kakinada, Andhra Pradesh, India
[2] MBTS Govt Polytech, Dept Elect & Commun Engn, Guntur, Andhra Pradesh, India
Keywords
Spiking neural network (SNN); neuromorphic system; leaky integrate and fire (LIF) neuron; neuron hardware block (NHB); pattern recognition; nonstandard finite difference (NSFD); field-programmable gate array (FPGA); DIGITAL MULTIPLIERLESS IMPLEMENTATION; CHIP
DOI
10.1142/S0218126625500380
CLC Number
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
Continuous-time neuron models are not directly suitable for hardware implementation, so suitable discrete versions must be derived. Moreover, the nonlinear components of the neuronal equations degrade system performance (in terms of operating frequency and resource usage). This paper focuses on the efficient implementation of Spiking Neural Networks (SNNs) on a Field-Programmable Gate Array (FPGA), with specific emphasis on the Leaky Integrate and Fire (LIF) neuron model. Its objective is to optimize the mathematical equations of the LIF model by approximating the nonlinear functions, enabling a simple, cost-effective and high-speed design. Existing LIF Neuron Hardware Blocks (NHBs) approximate the continuous model with standard difference schemes such as the Euler or Runge-Kutta methods. Mathematically, such approximations do not exactly reproduce all the dynamics of the continuous system: they are accurate for small step sizes but behave erratically as the step size increases. Hence, the corresponding discrete digital versions are not suitable for all applications. This paper uses a Nonstandard Finite Difference (NSFD) scheme for the hardware (FPGA) implementation of an exact discrete model of the LIF-based NHB that is valid for all step sizes. The model presented here achieves a clock frequency of 438.686 MHz, higher than the other models compared in this paper, and, unlike earlier designs, it is multiplier-less. It is further applied to an SNN for basic pattern recognition and shown to work correctly on the given patterns. The system was also evaluated on large datasets: it achieves a classification accuracy of 97.8% on MNIST handwritten digit recognition and 84% on COVID-19 chest CT scan image classification, 6% higher than existing SNNs.
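The NSFD idea described in the abstract can be illustrated in software. For the linear LIF equation tau * dV/dt = -(V - V_rest) + R*I, a nonstandard scheme replaces the Euler step size h by a denominator function phi(h) = tau * (1 - exp(-h/tau)), which makes the discrete update match the closed-form solution of the ODE for any step size (under piecewise-constant input). The sketch below is illustrative only; the parameter names and values are assumptions, not the paper's fixed-point FPGA design.

```python
import math

def lif_step_euler(v, i_in, h, tau=20.0, v_rest=0.0, r=1.0):
    # Standard forward-Euler update: accurate only for small h,
    # and unstable once h exceeds 2*tau.
    return v + (h / tau) * (-(v - v_rest) + r * i_in)

def lif_step_nsfd(v, i_in, h, tau=20.0, v_rest=0.0, r=1.0):
    # Nonstandard finite difference: the step size h is replaced by
    # the denominator function phi(h) = tau * (1 - exp(-h/tau)).
    # For the linear LIF ODE this reproduces the exact solution at
    # every grid point, regardless of how large h is.
    phi = tau * (1.0 - math.exp(-h / tau))
    return v + (phi / tau) * (-(v - v_rest) + r * i_in)

def lif_exact(v0, i_in, t, tau=20.0, v_rest=0.0, r=1.0):
    # Closed-form solution of tau * dV/dt = -(V - v_rest) + r*I
    # for constant input current I, starting from V(0) = v0.
    v_inf = v_rest + r * i_in
    return v_inf + (v0 - v_inf) * math.exp(-t / tau)
```

Note that for a fixed step size h, phi(h)/tau is a precomputed constant, so the NSFD update reduces to one constant scaling plus additions; this is consistent with (though not a description of) the multiplier-less hardware realization claimed in the paper.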
Pages: 25
Related Papers
50 records
  • [1] GLIF: A Unified Gated Leaky Integrate-and-Fire Neuron for Spiking Neural Networks
    Yao, Xingting
    Li, Fanrong
    Mo, Zitao
    Cheng, Jian
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [2] A Modified Adaptive Exponential Integrate and Fire neuron Model for Circuit Implementation of Spiking Neural Networks
    Gomar, Shaghayegh
    Ahmadi, Arash
    Eskandari, Elahe
    [J]. 2013 21ST IRANIAN CONFERENCE ON ELECTRICAL ENGINEERING (ICEE), 2013,
  • [3] A Highly Scalable Junctionless FET Leaky Integrate-and-Fire Neuron for Spiking Neural Networks
    Kamal, Neha
    Singh, Jawar
    [J]. IEEE TRANSACTIONS ON ELECTRON DEVICES, 2021, 68 (04) : 1633 - 1638
  • [4] Design of Leaky Integrate and Fire Neuron for Spiking Neural Networks Using Trench Bipolar I-MOS
    Lahgere, Avinash
    [J]. IEEE TRANSACTIONS ON NANOTECHNOLOGY, 2023, 22 : 260 - 265
  • [5] Linear leaky-integrate-and-fire neuron model based spiking neural networks and its mapping relationship to deep neural networks
    Lu, Sijia
    Xu, Feng
    [J]. FRONTIERS IN NEUROSCIENCE, 2022, 16
  • [6] Sharing leaky-integrate-and-fire neurons for memory-efficient spiking neural networks
    Kim, Youngeun
    Li, Yuhang
    Moitra, Abhishek
    Yin, Ruokai
    Panda, Priyadarshini
    [J]. FRONTIERS IN NEUROSCIENCE, 2023, 17
  • [7] An energy efficient leaky integrate and fire neuron using Ge-source TFET for spiking neural network: simulation analysis
    Tiwari, Shreyas
    Saha, Rajesh
    Varma, Tarun
    [J]. PHYSICA SCRIPTA, 2024, 99 (10)
  • [8] Dynamic threshold integrate and fire neuron model for low latency spiking neural networks
    Wu, Xiyan
    Zhao, Yufei
    Song, Yong
    Jiang, Yurong
    Bai, Yashuo
    Li, Xinyi
    Zhou, Ya
    Yang, Xin
    Hao, Qun
    [J]. NEUROCOMPUTING, 2023, 544
  • [9] Memristive leaky integrate-and-fire neuron and learnable straight-through estimator in spiking neural networks
    Chen, Tao
    She, Chunyan
    Wang, Lidan
    Duan, Shukai
    [J]. COGNITIVE NEURODYNAMICS, 2024,
  • [10] An ultra-compact leaky-integrate-and-fire model for building spiking neural networks
    Rozenberg, M. J.
    Schneegans, O.
    Stoliar, P.
    [J]. SCIENTIFIC REPORTS, 2019, 9