High-performance deep spiking neural networks via at-most-two-spike exponential coding

Cited: 0
Authors
Chen, Yunhua [1 ]
Feng, Ren [1 ]
Xiong, Zhimin [1 ]
Xiao, Jinsheng [2 ]
Liu, Jian K. [3 ]
Affiliations
[1] Guangdong Univ Technol, Sch Comp Sci & Technol, Guangzhou, Guangdong, Peoples R China
[2] Wuhan Univ, Sch Elect Informat, Wuhan, Peoples R China
[3] Univ Birmingham, Sch Comp Sci, Birmingham, England
Keywords
Deep spiking neural networks; ANN-SNN conversion; Time-based coding
DOI
10.1016/j.neunet.2024.106346
Chinese Library Classification
TP18 [Artificial intelligence theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Spiking neural networks (SNNs) provide necessary models and algorithms for neuromorphic computing. A popular way of building high-performance deep SNNs is to convert ANNs to SNNs, taking advantage of advanced and well-trained ANNs. Here we propose an ANN-to-SNN conversion methodology that uses a time-based coding scheme, named At-most-two-spike Exponential Coding (AEC), and a corresponding AEC spiking neuron model for ANN-SNN conversion. AEC neurons employ quantization-compensating spikes to improve coding accuracy and capacity, with each neuron generating up to two spikes within the time window. Two exponential decay functions with tunable parameters are proposed to represent the dynamic encoding thresholds, based on which pixel intensities are encoded into spike times and spike times are decoded into pixel intensities. The hyper-parameters of AEC neurons are fine-tuned based on the loss between SNN-decoded values and ANN activation values. In addition, we design two regularization terms on the number of spikes, making it possible to achieve the best trade-off between accuracy, latency, and power consumption. Experimental results show that, compared with similar methods, the proposed scheme not only yields deep SNNs with higher accuracy but also offers clear advantages in energy efficiency and inference latency. More details can be found at https://github.com/RPDS2020/AEC.git.
Pages: 9
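
Since the abstract only describes the AEC scheme at a high level, the following minimal Python sketch illustrates the general idea of encoding a value against exponentially decaying thresholds with at most two spikes per neuron. It assumes thresholds of the form theta_i(t) = exp(-t / tau_i); the parameter values (T, tau1, tau2), the ceil/round quantization rule, and the signed compensating spike are all illustrative assumptions, not the paper's exact formulation.

import numpy as np

# Illustrative sketch of at-most-two-spike exponential coding (AEC).
# Assumptions: encoding thresholds decay as theta_i(t) = exp(-t / tau_i);
# the parameter values, the quantization rule, and the signed
# compensating spike are hypothetical, not the paper's exact scheme.

T = 16                 # length of the time window (timesteps)
tau1, tau2 = 4.0, 1.0  # tunable decay constants of the two threshold functions

def encode(x):
    """Encode an activation x in (0, 1] into at most two spike times."""
    x = float(np.clip(x, 1e-6, 1.0))
    # First spike: earliest timestep at which theta1(t) = exp(-t/tau1)
    # has dropped to or below x.
    t1 = min(int(np.ceil(-tau1 * np.log(x))), T - 1)
    # Residual quantization error left after the first spike.
    r = x - np.exp(-t1 / tau1)
    if abs(r) < 1e-6:
        return t1, None                      # a single spike suffices
    # Second, quantization-compensating spike encodes |r| against the
    # faster-decaying threshold theta2; carrying its sign would need a
    # separate channel in a real implementation (an assumption here).
    t2 = min(int(round(-tau2 * np.log(abs(r)))), T - 1)
    return t1, (t2, float(np.sign(r)))

def decode(t1, comp):
    """Decode spike times back into an approximate activation value."""
    x_hat = np.exp(-t1 / tau1)
    if comp is not None:
        t2, sign = comp
        x_hat += sign * np.exp(-t2 / tau2)
    return x_hat

x = 0.37
t1, comp = encode(x)
print(t1, comp, decode(t1, comp))  # e.g. 4 (6, 1.0) 0.3703..., close to 0.37

In this toy version, shortening the time window T lowers latency and spike count but increases quantization error; the abstract's two spike-count regularization terms are aimed at exactly that accuracy/latency/power trade-off, though their precise form is not given here.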