Constructing Deep Spiking Neural Networks from Artificial Neural Networks with Knowledge Distillation

Cited by: 25
Authors
Xu, Qi [1 ]
Li, Yaxin [1 ]
Shen, Jiangrong [2 ]
Liu, Jian K. [3 ]
Tang, Huajin [2 ]
Pan, Gang [2 ]
Affiliations
[1] Dalian Univ Technol, Sch Artificial Intelligence, Dalian, Peoples R China
[2] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou, Peoples R China
[3] Univ Leeds, Sch Comp, Leeds, W Yorkshire, England
Funding
National Natural Science Foundation of China;
Keywords
DOI
10.1109/CVPR52729.2023.00762
CLC classification number
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Spiking neural networks (SNNs) are well known as brain-inspired models with high computing efficiency; their key feature is that they use spikes as information units, much as biological neural systems do. Although spike-based models are energy efficient by taking advantage of discrete spike signals, their performance is limited by current network structures and training methods. Because spikes are discrete, typical SNNs cannot apply gradient descent directly to parameter adjustment as artificial neural networks (ANNs) do. To address this limitation, we propose a novel method for constructing deep SNN models with knowledge distillation (KD), using an ANN as the teacher model and an SNN as the student model. Through an ANN-SNN joint training algorithm, the student SNN can learn rich feature information from the teacher ANN via KD, while avoiding training the SNN from scratch through non-differentiable spikes. Our method not only builds a more efficient deep spiking structure feasibly and reasonably, but also requires few time steps to train the whole model compared with direct training or ANN-to-SNN conversion methods. More importantly, it shows strong noise immunity against various types of artificial noise and natural signals. The proposed method provides an efficient way to improve the performance of SNNs by constructing deeper structures in a high-throughput fashion, with potential usage in light and efficient brain-inspired computing for practical scenarios.
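The teacher-student setup the abstract describes typically builds on the standard soft-target distillation loss: a hard cross-entropy term on the student's outputs plus a temperature-scaled KL term pulling the student toward the teacher's softened logits. The sketch below is a minimal NumPy illustration of that generic KD objective, not the paper's exact ANN-SNN joint training formulation; the function names, the temperature `T=4.0`, and the mixing weight `alpha` are assumptions for illustration.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax along the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Hinton-style KD loss: alpha * hard CE + (1 - alpha) * soft KL.

    In the ANN->SNN setting of the abstract, `teacher_logits` would come
    from the trained ANN and `student_logits` from the SNN's (rate-decoded)
    output, letting the student train without backpropagating through
    non-differentiable spikes at the output stage.
    """
    # Hard loss: cross-entropy against the ground-truth labels (T = 1).
    p_s = softmax(student_logits)
    ce = -np.log(p_s[np.arange(len(labels)), labels] + 1e-12).mean()
    # Soft loss: KL(teacher || student) at temperature T, scaled by T^2
    # so its gradient magnitude stays comparable across temperatures.
    p_t = softmax(teacher_logits, T)
    log_ps_T = np.log(softmax(student_logits, T) + 1e-12)
    kl = (p_t * (np.log(p_t + 1e-12) - log_ps_T)).sum(axis=-1).mean() * T * T
    return alpha * ce + (1 - alpha) * kl
```

When the student's logits match the teacher's exactly, the KL term vanishes and only the hard cross-entropy remains, which is one quick sanity check for an implementation like this.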
Pages: 7886-7895
Page count: 10