Efficient Spiking Neural Networks With Radix Encoding

Cited by: 5
Authors
Wang, Zhehui [1 ]
Gu, Xiaozhe [2 ]
Goh, Rick Siow Mong [1 ]
Zhou, Joey Tianyi [1 ]
Luo, Tao [1 ]
Affiliations
[1] A*STAR, Inst High Performance Comp, Singapore 138632, Singapore
[2] Chinese Univ Hong Kong, Future Network Intelligence Inst (FNii), Shenzhen 518172, Peoples R China
Keywords
Encoding; energy efficient; short spike train; speedup; spiking neural network (SNN); PROCESSOR; DRIVEN
DOI
10.1109/TNNLS.2022.3195918
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Spiking neural networks (SNNs) have advantages in latency and energy efficiency over traditional artificial neural networks (ANNs) due to their event-driven computation mechanism and their replacement of energy-consuming weight multiplication with addition. However, achieving high accuracy usually requires long spike trains, often more than 1000 time steps. This offsets the computational efficiency of SNNs, because a longer spike train means more operations and longer latency. In this article, we propose a radix-encoded SNN with ultrashort spike trains: it needs fewer than six time steps to achieve accuracy even higher than that of its traditional counterpart. We also develop a method to fit our radix encoding technique into the ANN-to-SNN conversion approach, so that radix-encoded SNNs can be trained more efficiently on mature platforms and hardware. Experiments show that, with the VGG-16 network on the CIFAR-10 dataset, our radix encoding achieves a 25x improvement in latency and a 1.7% improvement in accuracy over the state-of-the-art method.
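For intuition, a radix code spreads the digits of an activation value across time steps, so a spike at step t carries a positionally weighted contribution rather than the uniform weight of a rate code. Below is a minimal Python sketch of this idea; the base-2 weighting base**-(t+1), the six-step default, and the names radix_encode/radix_decode are illustrative assumptions, not details taken from the paper.

    def radix_encode(activation, num_steps=6, base=2):
        """Greedily encode a normalized activation in [0, 1) as a short
        spike train where step t carries weight base**-(t+1). This is a
        hypothetical reading of radix encoding, not the paper's exact scheme."""
        spikes = []
        residual = float(activation)
        for t in range(num_steps):
            weight = base ** -(t + 1.0)
            if residual >= weight:
                spikes.append(1)
                residual -= weight
            else:
                spikes.append(0)
        return spikes

    def radix_decode(spikes, base=2):
        """Reconstruct the approximate activation from the spike train."""
        return sum(s * base ** -(t + 1.0) for t, s in enumerate(spikes))

    # Example: 0.73 -> [1, 0, 1, 1, 1, 0] -> 0.71875 (error < 2**-6)
    train = radix_encode(0.73)
    print(train, radix_decode(train))

Under this assumption, each additional time step contributes one bit of precision, which is why six radix steps can match a resolution that a uniform rate code would need about 2**6 = 64 time steps to reach.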
Pages: 3689-3701
Page count: 13
Related Papers
50 entries in total
  • [21] EXODUS: Stable and efficient training of spiking neural networks
    Bauer, Felix C.
    Lenz, Gregor
    Haghighatshoar, Saeid
    Sheik, Sadique
    FRONTIERS IN NEUROSCIENCE, 2023, 17
  • [22] A Resource-efficient Spiking Neural Network Accelerator Supporting Emerging Neural Encoding
    Gerlinghoff, Daniel
    Wang, Zhehui
    Gu, Xiaozhe
    Goh, Rick Siow Mong
    Luo, Tao
    PROCEEDINGS OF THE 2022 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION (DATE 2022), 2022: 92 - 95
  • [23] Spike Trains Encoding Optimization for Spiking Neural Networks Implementation in FPGA
    Fang, Biao
    Zhang, Yuhao
    Yan, Rui
    Tang, Huajin
    2020 12TH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTATIONAL INTELLIGENCE (ICACI), 2020: 412 - 418
  • [24] Selection and Optimization of Temporal Spike Encoding Methods for Spiking Neural Networks
    Petro, Balint
    Kasabov, Nikola
    Kiss, Rita M.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (02): 358 - 370
  • [25] Neural Dynamics Pruning for Energy-Efficient Spiking Neural Networks
    Huang, Haoyu
    He, Linxuan
    Liu, Faqiang
    Zhao, Rong
    Shi, Luping
    2024 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME 2024, 2024
  • [26] Encoding of Input Signals in Terms of Path Complexes in Spiking Neural Networks
    Ilyin, V. A.
    Ivina, Ya. P.
    Khristichenko, M. Yu.
    Serenko, A. V.
    Rybka, R. B.
    MOSCOW UNIVERSITY PHYSICS BULLETIN, 2024, 79 (Suppl 2): S630 - S638
  • [27] Training Energy-Efficient Deep Spiking Neural Networks with Single-Spike Hybrid Input Encoding
    Datta, Gourav
    Kundu, Souvik
    Beerel, Peter A.
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021
  • [28] Efficient Processing of Spiking Neural Networks via Task Specialization
    Abu Lebdeh, Muath
    Yildirim, Kasim Sinan
    Brunelli, Davide
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2024, 8 (05): 3603 - 3613
  • [29] Efficient and Robust Supervised Learning Algorithm for Spiking Neural Networks
    Zhang, Y.
    Geng, T.
    Zhang, M.
    Wu, X.
    Zhou, J.
    Qu, H.
    SENSING AND IMAGING, 2018, 19 (1)
  • [30] BitSNNs: Revisiting Energy-Efficient Spiking Neural Networks
    Hu, Yangfan
    Zheng, Qian
    Pan, Gang
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2024, 16 (05): 1736 - 1747