High-performance deep spiking neural networks via at-most-two-spike exponential coding

Cited by: 0
Authors
Chen, Yunhua [1 ]
Feng, Ren [1 ]
Xiong, Zhimin [1 ]
Xiao, Jinsheng [2 ]
Liu, Jian K. [3 ]
Affiliations
[1] Guangdong Univ Technol, Sch Comp Sci & Technol, Guangzhou, Guangdong, Peoples R China
[2] Wuhan Univ, Sch Elect Informat, Wuhan, Peoples R China
[3] Univ Birmingham, Sch Comp Sci, Birmingham, England
Keywords
Deep spiking neural networks; ANN-SNN conversion; Time-based coding
DOI
10.1016/j.neunet.2024.106346
Chinese Library Classification (CLC) number
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Spiking neural networks (SNNs) provide necessary models and algorithms for neuromorphic computing. A popular way of building high-performance deep SNNs is to convert ANNs to SNNs, taking advantage of advanced and well-trained ANNs. Here we propose an ANN-to-SNN conversion methodology that uses a time-based coding scheme, named At-most-two-spike Exponential Coding (AEC), and a corresponding AEC spiking neuron model. AEC neurons employ quantization-compensating spikes to improve coding accuracy and capacity, with each neuron generating at most two spikes within the time window. Two exponential decay functions with tunable parameters are proposed to represent the dynamic encoding thresholds, based on which pixel intensities are encoded into spike times and spike times are decoded back into pixel intensities. The hyper-parameters of AEC neurons are fine-tuned by minimizing a loss between the SNN-decoded values and the ANN activation values. In addition, we design two regularization terms on the number of spikes, making it possible to trade off accuracy, latency, and power consumption. Experimental results show that, compared to similar methods, the proposed scheme not only yields deep SNNs with higher accuracy but also offers significant advantages in energy efficiency and inference latency. More details can be found at https://github.com/RPDS2020/AEC.git.
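For illustration, here is a minimal sketch of the kind of exponential time-based encoding the abstract describes, assuming a decaying threshold theta0 * exp(-t / tau) and a single shared decay constant for the compensating spike. The function names, parameters (theta0, tau, t_max), and the residual-compensation rule are assumptions made for this sketch, not the paper's exact formulation, which uses two separately tunable decay functions.

import numpy as np

def aec_encode(x, theta0=1.0, tau=2.0, t_max=16):
    """Encode activation x into at most two spike times (None = no spike).

    The first spike fires at the earliest integer step t1 at which the
    decaying threshold theta0 * exp(-t / tau) has fallen to x or below;
    a second, quantization-compensating spike encodes the residual left
    over after rounding t1 to the discrete time grid.
    """
    if x <= 0:
        return None, None                          # non-positive inputs emit no spike
    t1 = int(np.clip(np.ceil(tau * np.log(theta0 / x)), 0, t_max))
    x1 = theta0 * np.exp(-t1 / tau)                # value recoverable from t1 alone
    residual = x - x1
    if residual <= 0 or t1 >= t_max:
        return t1, None                            # first spike is exact, or window is full
    t2 = int(np.clip(np.ceil(tau * np.log(theta0 / residual)), t1 + 1, t_max))
    return t1, t2

def aec_decode(t1, t2, theta0=1.0, tau=2.0):
    """Invert the encoding: sum the threshold values at the spike times."""
    if t1 is None:
        return 0.0
    x_hat = theta0 * np.exp(-t1 / tau)
    if t2 is not None:
        x_hat += theta0 * np.exp(-t2 / tau)
    return x_hat

# Example: the compensating spike shrinks the quantization error.
t1, t2 = aec_encode(0.7)                           # -> (1, 5)
print(aec_decode(t1, None))                        # ~0.607 with one spike
print(aec_decode(t1, t2))                          # ~0.689 with two spikes (target 0.7)

In the paper's pipeline, theta0 and tau would be fine-tuned per layer against the loss between SNN-decoded and ANN-activation values, and the two spike-count regularizers would penalize the second spike wherever its accuracy gain does not justify the extra energy and latency.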
Pages: 9