Optimization Techniques for Conversion of Quantization Aware Trained Deep Neural Networks to Lightweight Spiking Neural Networks

Cited by: 0
Authors
Lee, Kyungchul [1 ]
Choi, Sunghyun [1 ]
Lew, Dongwoo [1 ]
Park, Jongsun [1 ]
Affiliations
[1] Korea Univ, Seoul, South Korea
Keywords
spiking neural networks; ANN-SNN conversion; quantization aware training
DOI
10.1109/ITC-CSCC52171.2021.9501427
CLC Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
In this paper, we present a spiking neural network (SNN) conversion technique optimized for converting low-bit-width artificial neural networks (ANNs) trained with quantization-aware training (QAT). The conventional conversion technique suffers a significant accuracy drop on QAT ANNs due to the different activation function these networks use. To minimize this accuracy drop, the proposed technique uses Spike-Norm Skip, which selectively applies threshold balancing. In addition, a subtraction-based reset is used to further reduce accuracy degradation. The proposed conversion technique achieves an accuracy of 89.92% (a 0.68% drop) with 5-bit weights on CIFAR-10 using VGG-16.
Pages: 3