SpikeConverter: An Efficient Conversion Framework Zipping the Gap between Artificial Neural Networks and Spiking Neural Networks

Cited by: 0
Authors
Liu, Fangxin [1 ,2 ]
Zhao, Wenbo [1 ,2 ]
Chen, Yongbiao [1 ]
Wang, Zongwu [1 ]
Jiang, Li [1 ,2 ,3 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Shanghai, Peoples R China
[2] Shanghai Qi Zhi Inst, Shanghai, Peoples R China
[3] Shanghai Jiao Tong Univ, AI Inst, Key Lab Artificial Intelligence, MoE, Shanghai, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Spiking Neural Networks (SNNs) have recently attracted enormous research interest because their event-driven, brain-inspired structure enables low-power computation. In image recognition tasks, the best results achieved by SNNs so far rely on ANN-SNN conversion methods that replace the activation functions of artificial neural networks (ANNs) with integrate-and-fire neurons. Compared to their source ANNs, converted SNNs usually suffer accuracy loss and require a considerable number of time steps to reach competitive accuracy. We find that this performance degradation stems from the fact that the information capacity of spike trains in the converted network is smaller than that of the activation values in the source ANN, so less information is passed during SNN inference. To better correlate the ANN and the SNN, we propose a conversion framework that narrows the gap between the activation values of the source ANN and the spike trains generated by the target SNN. The framework originates from an identical relation explored in the conversion and exploits a temporal separation scheme and a novel neuron model to make that relation hold. We demonstrate almost lossless ANN-SNN conversion with SpikeConverter for a wide variety of networks on challenging datasets including CIFAR-10, CIFAR-100, and ImageNet. Our results also show that SpikeConverter reaches this accuracy across different network architectures and datasets with 32x-512x fewer inference time steps than state-of-the-art ANN-SNN conversion methods.
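The conversion idea summarized in the abstract, replacing ReLU activations with integrate-and-fire (IF) neurons so that firing rates approximate activation values, can be illustrated with a minimal sketch. The Python snippet below shows only the generic rate-coding intuition behind ANN-SNN conversion; it is not SpikeConverter's temporal-separation scheme or neuron model, and the function name, threshold, and time-step count T are illustrative assumptions.

```python
# Minimal sketch of the rate-coding intuition behind ANN-SNN conversion:
# an integrate-and-fire (IF) neuron with subtraction ("soft") reset whose
# firing rate over T time steps approximates a clipped ReLU activation.
# This is NOT SpikeConverter's own scheme; threshold and T are assumptions.

def if_neuron_rate(weighted_input, T=64, v_threshold=1.0):
    """Firing rate of an IF neuron driven by a constant weighted input."""
    v = 0.0          # membrane potential
    spikes = 0
    for _ in range(T):
        v += weighted_input          # integrate the input current
        if v >= v_threshold:         # fire, then reset by subtraction
            spikes += 1
            v -= v_threshold
    return spikes / T                # spike rate ~ min(max(x, 0), 1)

if __name__ == "__main__":
    for x in (-0.3, 0.2, 0.55, 0.9):
        print(f"input={x:+.2f}  relu={max(x, 0.0):.3f}  "
              f"rate(T=64)={if_neuron_rate(x):.3f}")
```

For a constant input, the rate's approximation error is bounded by roughly 1/T, which is why naive conversions need many time steps to match ANN accuracy and why reducing the required time steps, as the paper reports, matters for latency and energy.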
Pages: 1692-1701
Number of pages: 10
Related Papers
50 records in total
  • [41] Neural Architecture Search for Spiking Neural Networks
    Kim, Youngeun
    Li, Yuhang
    Park, Hyoungseob
    Venkatesha, Yeshwanth
    Panda, Priyadarshini
    [J]. COMPUTER VISION, ECCV 2022, PT XXIV, 2022, 13684 : 36 - 56
  • [42] Incremental Neural Synthesis for Spiking Neural Networks
    Huy Le Nguyen
    Chu, Dominique
    [J]. 2022 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (SSCI), 2022, : 649 - 656
  • [43] Attention Spiking Neural Networks
    Yao, Man
    Zhao, Guangshe
    Zhang, Hengyu
    Hu, Yifan
    Deng, Lei
    Tian, Yonghong
    Xu, Bo
    Li, Guoqi
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (08) : 9393 - 9410
  • [44] Simulation of spiking neural networks
    Bako, Laszlo
    Szekely, Iuliu
    David, Laszlo
    Brassai, Tihamer Sandor
    [J]. PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON OPTIMIZATION OF ELECTRICAL AND ELECTRONIC EQUIPMENT, VOL III: INDUSTRIAL AUTOMATION AND CONTROL, 2004, : 179 - 184
  • [45] Agreement in Spiking Neural Networks
    Kunev, Martin
    Kuznetsov, Petr
    Sheynikhovich, Denis
    [J]. JOURNAL OF COMPUTATIONAL BIOLOGY, 2022, 29 (04) : 358 - 369
  • [46] Applications of spiking neural networks
    Bohte, SM
    Kok, JN
    [J]. INFORMATION PROCESSING LETTERS, 2005, 95 (06) : 519 - 520
  • [47] A Survey on Spiking Neural Networks
    Han, Chan Sik
    Lee, Keon Myung
    [J]. INTERNATIONAL JOURNAL OF FUZZY LOGIC AND INTELLIGENT SYSTEMS, 2021, 21 (04) : 317 - 337
  • [48] Designing Spiking Neural Networks
    Dorogyy, Yaroslav
    Kolisnichenko, Vadym
    [J]. 2016 13TH INTERNATIONAL CONFERENCE ON MODERN PROBLEMS OF RADIO ENGINEERING, TELECOMMUNICATIONS AND COMPUTER SCIENCE (TCSET), 2016, : 124 - 127
  • [49] Spiking Neural Networks: A Survey
    Nunes, Joao D.
    Carvalho, Marcelo
    Carneiro, Diogo
    Cardoso, Jaime S.
    [J]. IEEE ACCESS, 2022, 10 : 60738 - 60764
  • [50] CQ+ Training: Minimizing Accuracy Loss in Conversion From Convolutional Neural Networks to Spiking Neural Networks
    Yan, Zhanglu
    Zhou, Jun
    Wong, Weng-Fai
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (10) : 11600 - 11611