Efficient Spiking Neural Networks With Radix Encoding

Cited by: 5
Authors
Wang, Zhehui [1 ]
Gu, Xiaozhe [2 ]
Goh, Rick Siow Mong [1 ]
Zhou, Joey Tianyi [1 ]
Luo, Tao [1 ]
Affiliations
[1] A*STAR, Inst High Performance Comp, Singapore 138632, Singapore
[2] Chinese Univ Hong Kong, Future Network Intelligence Inst FNii, Shenzhen 518172, Peoples R China
Keywords
Encoding; energy efficient; short spike train; speedup; spiking neural network (SNN); PROCESSOR; DRIVEN;
DOI
10.1109/TNNLS.2022.3195918
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Spiking neural networks (SNNs) have latency and energy-efficiency advantages over traditional artificial neural networks (ANNs) due to their event-driven computation mechanism and the replacement of energy-consuming weight multiplications with additions. However, achieving high accuracy usually requires long spike trains, often of more than 1000 time steps. This offsets the computational efficiency of SNNs, because a longer spike train means more operations and higher latency. In this article, we propose a radix-encoded SNN with ultrashort spike trains: it needs fewer than six time steps to achieve accuracy even higher than that of its traditional counterpart. We also develop a method to fit our radix encoding technique into the ANN-to-SNN conversion approach, so that radix-encoded SNNs can be trained efficiently on mature platforms and hardware. Experiments show that, with the VGG-16 network on the CIFAR-10 dataset, our radix encoding achieves a 25x improvement in latency and a 1.7% improvement in accuracy over the state-of-the-art method.
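The abstract's core idea is that a value can be carried by a positional (radix) code rather than a long rate code: each time step contributes one base-r digit, so a handful of steps suffices. A minimal sketch of that encoding idea, not the paper's actual implementation (the function names, the choice of radix 2, and the fixed-point quantization of activations to [0, 1) are all assumptions for illustration):

```python
def radix_encode(x, steps=6, radix=2):
    """Encode x in [0, 1) as one base-`radix` digit per time step.

    With radix=2 each digit is a single spike/no-spike decision, so the
    whole value fits in `steps` time steps instead of a long rate code
    that would need O(radix**steps) steps for the same resolution.
    """
    digits = []
    for t in range(1, steps + 1):
        # The t-th base-`radix` fractional digit of x.
        digits.append(int(x * radix ** t) % radix)
    return digits


def radix_decode(digits, radix=2):
    """Recover the encoded value: the digit at step t has weight radix**-t."""
    return sum(d * radix ** -(t + 1) for t, d in enumerate(digits))


# 0.625 = 0.101 in binary -> spikes at time steps 1 and 3.
print(radix_encode(0.625, steps=4))   # [1, 0, 1, 0]
print(radix_decode([1, 0, 1, 0]))     # 0.625
```

The contrast with rate coding is the point: a rate code needs on the order of 2**k time steps to distinguish k bits of activation, while the radix code above needs only k steps, which is why spike trains of fewer than six steps can suffice.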
Pages: 3689-3701 (13 pages)
Related Papers (50 total)
  • [1] An Efficient and Perceptually Motivated Auditory Neural Encoding and Decoding Algorithm for Spiking Neural Networks
    Pan, Zihan
    Chua, Yansong
    Wu, Jibin
    Zhang, Malu
    Li, Haizhou
    Ambikairajah, Eliathamby
    FRONTIERS IN NEUROSCIENCE, 2020, 13
  • [2] Efficient learning in spiking neural networks
    Rast, Alexander
    Aoun, Mario Antoine
    Elia, Eleni G.
    Crook, Nigel
    NEUROCOMPUTING, 2024, 597
  • [3] A TIME ENCODING APPROACH TO TRAINING SPIKING NEURAL NETWORKS
    Adam, Karen
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 5957 - 5961
  • [4] A Reversibility Analysis of Encoding Methods for Spiking Neural Networks
    Johnson, C.
    Roychowdhury, S.
    Venayagamoorthy, G. K.
    2011 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2011, : 1802 - 1809
  • [5] Temporal data encoding and sequence learning with spiking neural networks
    Fujii, Robert H.
    Oozeki, Kenjyu
    ARTIFICIAL NEURAL NETWORKS - ICANN 2006, PT 1, 2006, 4131 : 780 - 789
  • [6] Neural Encoding and Spike Generation for Spiking Neural Networks implemented in FPGA
    de Oliveira Neto, Jose Rodrigues
    Cerquinho Cajueiro, Joao Paulo
    Ranhel, Joao
    25TH INTERNATIONAL CONFERENCE ON ELECTRONICS, COMMUNICATIONS AND COMPUTERS (CONIELECOMP 2015), 2015, : 55 - 61
  • [7] Identifying Efficient Dataflows for Spiking Neural Networks
    Sharma, Deepika
    Ankit, Aayush
    Roy, Kaushik
    2022 ACM/IEEE INTERNATIONAL SYMPOSIUM ON LOW POWER ELECTRONICS AND DESIGN, ISLPED 2022, 2022,
  • [8] Efficient Structure Slimming for Spiking Neural Networks
    Li Y.
    Fang X.
    Gao Y.
    Zhou D.
    Shen J.
    Liu J.K.
    Pan G.
    Xu Q.
    IEEE Transactions on Artificial Intelligence, 2024, 5 (08): : 1 - 9
  • [9] Approaches to efficient simulation with spiking neural networks
    Connolly, CG
    Marian, I
    Reilly, RG
    CONNECTIONIST MODELS OF COGNITION AND PERCEPTION II, 2004, 15 : 231 - 240
  • [10] Effective and Efficient Spiking Recurrent Neural Networks
    Yin, Bojian
    Corradi, Federico
    Bohte, Sander
    ERCIM NEWS, 2021, (125): : 9 - 10