SpikeConverter: An Efficient Conversion Framework Zipping the Gap between Artificial Neural Networks and Spiking Neural Networks

Cited by: 0
Authors:
Liu, Fangxin [1 ,2 ]
Zhao, Wenbo [1 ,2 ]
Chen, Yongbiao [1 ]
Wang, Zongwu [1 ]
Jiang, Li [1 ,2 ,3 ]
Affiliations:
[1] Shanghai Jiao Tong Univ, Shanghai, Peoples R China
[2] Shanghai Qi Zhi Inst, Shanghai, Peoples R China
[3] Shanghai Jiao Tong Univ, AI Inst, Key Lab Artificial Intelligence, MoE, Shanghai, Peoples R China
Funding: National Natural Science Foundation of China
Keywords:
DOI: not available
CLC number: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Spiking Neural Networks (SNNs) have recently attracted enormous research interest because their event-driven, brain-inspired structure enables low-power computation. In image recognition tasks, the best results achieved by SNNs so far use ANN-SNN conversion methods that replace the activation functions in artificial neural networks (ANNs) with integrate-and-fire neurons. Compared to their source ANNs, converted SNNs usually suffer accuracy loss and require a considerable number of time steps to reach competitive accuracy. We find that this performance degradation stems from the fact that the information capacity of the spike trains in the converted network is smaller than that of the activation values in the source ANN, so less information is passed during SNN inference. To better correlate ANNs and SNNs, we propose a conversion framework that mitigates the gap between the activation values of the source ANN and the spike trains generated by the target SNN. The framework originates from exploring an identical relation in the conversion and exploits a temporal separation scheme and a novel neuron model to make that relation hold. We demonstrate almost lossless ANN-SNN conversion with SpikeConverter for a wide variety of networks on challenging datasets, including CIFAR-10, CIFAR-100, and ImageNet. Our results also show that SpikeConverter achieves this accuracy across different network architectures and datasets using 32x to 512x fewer inference time steps than state-of-the-art ANN-SNN conversion methods.
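The abstract's starting point, replacing ReLU activations with integrate-and-fire neurons so that spike rates approximate activation values, can be illustrated with a minimal sketch. This shows only the generic rate-coding correspondence that ANN-SNN conversion builds on, not the paper's SpikeConverter scheme; the function names and the soft-reset behavior are illustrative assumptions.

```python
def relu(x):
    """ReLU activation used in the source ANN."""
    return max(x, 0.0)

def if_neuron_rate(x, threshold=1.0, steps=100):
    """Integrate-and-fire neuron driven by a constant input x at every
    time step. When the membrane potential v reaches the threshold, the
    neuron spikes and the threshold is subtracted (soft reset). For
    0 <= x <= threshold, the firing rate approximates relu(x)/threshold."""
    v = 0.0
    spikes = 0
    for _ in range(steps):
        v += x               # integrate the input
        if v >= threshold:   # fire and soft-reset
            spikes += 1
            v -= threshold
    return spikes / steps

print(relu(0.37), if_neuron_rate(0.37))  # the two values converge as steps grows
```

Note that over 100 time steps the firing rate can take only 101 distinct values, whereas the ANN activation is a real number. This coarse quantization is exactly the limited information capacity of spike trains that the abstract identifies as the source of accuracy loss, which is why naive rate coding needs many time steps to be competitive.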
Pages: 1692-1701 (10 pages)