Efficient Spiking Neural Networks With Logarithmic Temporal Coding

Cited by: 6
Authors
Zhang, Ming [1 ]
Gu, Zonghua [2 ]
Zheng, Nenggan [3 ]
Ma, De [1 ,4 ]
Pan, Gang [1 ,4 ]
Affiliations
[1] Zhejiang Univ, Coll Comp Sci, Hangzhou 310027, Peoples R China
[2] Umea Univ, Dept Appl Phys & Elect, S-90187 Umea, Sweden
[3] Zhejiang Univ, Qiushi Acad Adv Studies, Hangzhou 310027, Peoples R China
[4] Zhejiang Lab, Hangzhou 311121, Peoples R China
Source
IEEE ACCESS | 2020 / Vol. 8
Keywords
Encoding; Training; Biological information theory; Biological neural networks; Computational modeling; Spiking neural networks; temporal coding; neuromorphic computing;
DOI
10.1109/ACCESS.2020.2994360
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
A Spiking Neural Network (SNN) can be trained indirectly by first training an Artificial Neural Network (ANN) with the conventional backpropagation algorithm, then converting it into an equivalent SNN. To reduce the computational cost of the resulting SNN, as measured by the number of spikes, we present Logarithmic Temporal Coding (LTC), in which the number of spikes used to encode an activation grows logarithmically with the activation value, and the accompanying Exponentiate-and-Fire (EF) neuron model, which involves only efficient bit-shift and addition operations. Moreover, we improve the ANN training process to compensate for the approximation errors introduced by LTC. Experimental results indicate that the resulting SNN achieves competitive classification accuracy at a significantly lower computational cost than related work.
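The abstract does not spell out the encoding scheme; as a minimal sketch, assuming LTC places one spike at each set bit of a (quantized) integer activation, the logarithmic spike count and the shift-and-add decoding in the spirit of the EF neuron can be illustrated as follows (function names are hypothetical, not from the paper):

```python
def ltc_encode(activation):
    """Encode a non-negative integer activation as spike times at the
    positions of its set bits, so the spike count is at most
    floor(log2(activation)) + 1, i.e. logarithmic in the value."""
    spikes, t = [], 0
    while activation:
        if activation & 1:
            spikes.append(t)
        activation >>= 1
        t += 1
    return spikes


def ef_style_decode(spikes, num_steps):
    """Recover the activation using only bit shifts and additions,
    mimicking a membrane potential that doubles at every time step."""
    potential = 0
    for t in reversed(range(num_steps)):
        potential = (potential << 1) + (1 if t in spikes else 0)
    return potential
```

For example, the activation 13 (binary 1101) is carried by just three spikes, at times 0, 2, and 3, rather than by 13 rate-coded spikes, which is the source of the claimed cost reduction.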
Pages: 98156-98167
Page count: 12
Related Papers
50 items in total
  • [31] Identifying Efficient Dataflows for Spiking Neural Networks
    Sharma, Deepika
    Ankit, Aayush
    Roy, Kaushik
    [J]. 2022 ACM/IEEE INTERNATIONAL SYMPOSIUM ON LOW POWER ELECTRONICS AND DESIGN, ISLPED 2022, 2022,
  • [32] Efficient Structure Slimming for Spiking Neural Networks
    Li, Yaxin
    Fang, Xuanye
    Gao, Yuyuan
    Zhou, Dongdong
    Shen, Jiangrong
    Liu, Jian K.
    Pan, Gang
    Xu, Qi
    [J]. IEEE Transactions on Artificial Intelligence, 2024, 5 (08): : 3823 - 3831
  • [33] Efficient Spiking Neural Networks With Radix Encoding
    Wang, Zhehui
    Gu, Xiaozhe
    Goh, Rick Siow Mong
    Zhou, Joey Tianyi
    Luo, Tao
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (03) : 3689 - 3701
  • [34] Approaches to efficient simulation with spiking neural networks
    Connolly, CG
    Marian, I
    Reilly, RG
    [J]. CONNECTIONIST MODELS OF COGNITION AND PERCEPTION II, 2004, 15 : 231 - 240
  • [35] Effective and Efficient Spiking Recurrent Neural Networks
    Yin, Bojian
    Corradi, Federico
    Bohte, Sander
    [J]. ERCIM NEWS, 2021, (125): : 9 - 10
  • [36] A Hybrid Neural Coding Approach for Pattern Recognition With Spiking Neural Networks
    Chen, Xinyi
    Yang, Qu
    Wu, Jibin
    Li, Haizhou
    Tan, Kay Chen
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (05) : 3064 - 3078
  • [37] Hebbian learning in networks of spiking neurons using temporal coding
    Ruf, B
    Schmitt, M
    [J]. BIOLOGICAL AND ARTIFICIAL COMPUTATION: FROM NEUROSCIENCE TO TECHNOLOGY, 1997, 1240 : 380 - 389
  • [38] TEMPORAL CODING IN REALISTIC NEURAL NETWORKS
    GERASYUTA, SM
    IVANOV, DV
    [J]. JOURNAL DE PHYSIQUE I, 1995, 5 (10): : 1367 - 1374
  • [39] Temporal Effective Batch Normalization in Spiking Neural Networks
    Duan, Chaoteng
    Ding, Jianhao
    Chen, Shiyan
    Yu, Zhaofei
    Huang, Tiejun
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [40] RATE CODING OR DIRECT CODING: WHICH ONE IS BETTER FOR ACCURATE, ROBUST, AND ENERGY-EFFICIENT SPIKING NEURAL NETWORKS?
    Kim, Youngeun
    Park, Hyoungseob
    Moitra, Abhishek
    Bhattacharjee, Abhiroop
    Venkatesha, Yeshwanth
    Panda, Priyadarshini
    [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 71 - 75