High-performance deep spiking neural networks via at-most-two-spike exponential coding

Cited by: 0
Authors
Chen, Yunhua [1 ]
Feng, Ren [1 ]
Xiong, Zhimin [1 ]
Xiao, Jinsheng [2 ]
Liu, Jian K. [3 ]
Affiliations
[1] Guangdong Univ Technol, Sch Comp Sci & Technol, Guangzhou, Guangdong, Peoples R China
[2] Wuhan Univ, Sch Elect Informat, Wuhan, Peoples R China
[3] Univ Birmingham, Sch Comp Sci, Birmingham, England
Keywords
Deep spiking neural networks; ANN-SNN conversion; Time-based coding
DOI
10.1016/j.neunet.2024.106346
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Spiking neural networks (SNNs) provide the models and algorithms needed for neuromorphic computing. A popular way of building high-performance deep SNNs is to convert ANNs to SNNs, taking advantage of advanced, well-trained ANNs. Here we propose an ANN-to-SNN conversion methodology that uses a time-based coding scheme, named At-most-two-spike Exponential Coding (AEC), and a corresponding AEC spiking neuron model for ANN-SNN conversion. AEC neurons employ quantization-compensating spikes to improve coding accuracy and capacity, with each neuron generating at most two spikes within the time window. Two exponential decay functions with tunable parameters are proposed to represent the dynamic encoding thresholds, based on which pixel intensities are encoded into spike times and spike times are decoded back into pixel intensities. The hyperparameters of AEC neurons are fine-tuned based on a loss function between SNN-decoded values and ANN activation values. In addition, we design two regularization terms on the number of spikes, making it possible to achieve the best trade-off among accuracy, latency, and power consumption. Experimental results show that, compared to other similar methods, the proposed scheme not only yields deep SNNs with higher accuracy but also has significant advantages in energy efficiency and inference latency. More details can be found at https://github.com/RPDS2020/AEC.git.
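To make the coding scheme concrete, below is a minimal illustrative sketch (in Python/NumPy) of exponential time-based coding with an optional quantization-compensating second spike. The window length T, the decay parameters (THETA1, TAU1, THETA2, TAU2), and the first-crossing encode/decode rules are assumptions chosen for illustration; they are not the paper's implementation or tuned values.

import numpy as np

# Assumed illustrative parameters (not the paper's tuned values).
T = 16                     # time window length
THETA1, TAU1 = 1.0, 6.0    # first exponential-decay threshold
THETA2, TAU2 = 0.1, 6.0    # second decay, for the compensating spike

def thresholds(theta0, tau):
    """Dynamic encoding threshold theta0 * exp(-t / tau) over the window."""
    return theta0 * np.exp(-np.arange(T) / tau)

def encode(x):
    """Encode a normalized intensity x in (0, 1] into at most two spike times."""
    th1, th2 = thresholds(THETA1, TAU1), thresholds(THETA2, TAU2)
    hit1 = np.nonzero(th1 <= x)[0]
    if hit1.size == 0:
        return ()                      # intensity too small: no spike
    t1 = int(hit1[0])                  # fire when the threshold first drops to x
    residual = x - th1[t1]             # quantization error of the first spike
    hit2 = np.nonzero(th2 <= residual)[0]
    if hit2.size == 0:
        return (t1,)                   # one spike is already accurate enough
    return (t1, int(hit2[0]))          # second spike compensates the residual

def decode(spikes):
    """Invert encode(): sum the threshold values at the spike times."""
    th1, th2 = thresholds(THETA1, TAU1), thresholds(THETA2, TAU2)
    if not spikes:
        return 0.0
    x_hat = float(th1[spikes[0]])
    if len(spikes) == 2:
        x_hat += float(th2[spikes[1]])
    return x_hat

for x in (0.9, 0.5, 0.12):
    s = encode(x)
    print(f"x={x:.2f}  spikes={s}  decoded={decode(s):.4f}")

In this sketch the second spike bounds the decoding error by the step size of the second threshold curve, which conveys the intuition behind a quantization-compensating spike: a coarse first spike plus a fine correction, with at most two spikes per neuron per window.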
Pages: 9