Constructing Deep Spiking Neural Networks from Artificial Neural Networks with Knowledge Distillation

Cited: 25
Authors
Xu, Qi [1 ]
Li, Yaxin [1 ]
Shen, Jiangrong [2 ]
Liu, Jian K. [3 ]
Tang, Huajin [2 ]
Pan, Gang [2 ]
Affiliations
[1] Dalian Univ Technol, Sch Artificial Intelligence, Dalian, Peoples R China
[2] Zhejiang Univ, Coll Comp Sci & Technol, Hangzhou, Peoples R China
[3] Univ Leeds, Sch Comp, Leeds, W Yorkshire, England
Funding
National Natural Science Foundation of China;
Keywords
DOI
10.1109/CVPR52729.2023.00762
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Spiking neural networks (SNNs) are well known as brain-inspired models with high computing efficiency: they use discrete spikes as information units, much like biological neural systems. Although spike-based models are energy efficient, their performance is limited by current network structures and training methods. Because spikes are discrete, typical SNNs cannot apply gradient descent directly to parameter adjustment as artificial neural networks (ANNs) do. To address this limitation, we propose a novel method for constructing deep SNN models with knowledge distillation (KD), using an ANN as the teacher model and an SNN as the student model. Through an ANN-SNN joint training algorithm, the student SNN learns rich feature information from the teacher ANN via KD, while avoiding training the SNN from scratch through non-differentiable spikes. Our method not only builds more efficient deep spiking structures feasibly and reasonably, but also trains the whole model with few time steps compared to direct training or ANN-to-SNN conversion methods. More importantly, it is highly robust to various types of artificial noise and natural perturbations. The proposed method provides an efficient way to improve SNN performance by constructing deeper structures in a high-throughput fashion, with potential use in light and efficient brain-inspired computing in practical scenarios.
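To make the setup described in the abstract concrete, below is a minimal sketch of ANN-to-SNN knowledge distillation with a surrogate gradient. It is not the authors' released code: the layer sizes, the rectangular surrogate gradient, the constant-current input encoding, and the hyperparameters (T=4 time steps, temperature tau_kd, loss weight alpha) are all illustrative assumptions.

```python
# Minimal sketch (not the authors' released code) of ANN-to-SNN knowledge
# distillation: a frozen ANN teacher provides softened logits, and an SNN
# student with surrogate-gradient spiking neurons is trained over a small
# number of time steps. All sizes and hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike forward; rectangular surrogate gradient backward,
    sidestepping the non-differentiable spike during training."""
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v >= 1.0).float()                          # fire at threshold 1.0
    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        return grad_out * ((v - 1.0).abs() < 0.5).float()  # gradient window near threshold

class LIFLayer(nn.Module):
    """Leaky integrate-and-fire neurons driven by a linear projection."""
    def __init__(self, in_dim, out_dim, tau=2.0):
        super().__init__()
        self.fc, self.tau = nn.Linear(in_dim, out_dim), tau
    def forward(self, x, v):
        v = v / self.tau + self.fc(x)                      # leaky integration of input current
        s = SurrogateSpike.apply(v)
        return s, v * (1.0 - s)                            # hard reset where a spike fired

def distill_step(teacher, layers, readout, x, y, T=4, tau_kd=4.0, alpha=0.7):
    """One training step: Hinton-style KD loss against the frozen ANN teacher
    plus the ordinary task loss, both on the SNN's rate-coded logits."""
    with torch.no_grad():
        t_logits = teacher(x)                              # soft targets from the ANN
    v1 = v2 = 0.0
    logits = 0.0
    for _ in range(T):                                     # unroll the SNN over T steps
        s1, v1 = layers[0](x, v1)                          # constant-current encoding of x
        s2, v2 = layers[1](s1, v2)
        logits = logits + readout(s2)
    logits = logits / T                                    # spike counts -> rate-coded logits
    kd = F.kl_div(F.log_softmax(logits / tau_kd, dim=1),
                  F.softmax(t_logits / tau_kd, dim=1),
                  reduction="batchmean") * tau_kd ** 2
    return alpha * kd + (1 - alpha) * F.cross_entropy(logits, y)

# Hypothetical wiring, e.g. for flattened MNIST inputs:
#   teacher = some pretrained ANN mapping (B, 784) -> (B, 10)
#   layers  = nn.ModuleList([LIFLayer(784, 256), LIFLayer(256, 256)])
#   readout = nn.Linear(256, 10)
#   loss    = distill_step(teacher, layers, readout, x, y)
```

Averaging the readout over the T time steps turns spike counts into rate-coded logits, which is what allows a standard temperature-scaled KL distillation loss to be applied to the SNN output despite its binary activity.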
Pages: 7886-7895
Page count: 10
Related Papers
50 items in total
  • [1] Distilling Spikes: Knowledge Distillation in Spiking Neural Networks
    Kushawaha, Ravi Kumar
    Kumar, Saurabh
    Banerjee, Biplab
    Velmurugan, Rajbabu
    [J]. 2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021: 4536 - 4543
  • [2] Self-architectural knowledge distillation for spiking neural networks
    Qiu, Haonan
    Ning, Munan
    Song, Zeyin
    Fang, Wei
    Chen, Yanqi
    Sun, Tao
    Ma, Zhengyu
    Yuan, Li
    Tian, Yonghong
    [J]. NEURAL NETWORKS, 2024, 178
  • [3] DSNNs: learning transfer from deep neural networks to spiking neural networks
    Zhang, Lei
    Du, Zidong
    Li, Ling
    Chen, Yunji
    [J]. High Technology Letters, 2020, 26 (02): 136 - 144
  • [4] Knowledge Distillation for Optimization of Quantized Deep Neural Networks
    Shin, Sungho
    Boo, Yoonho
    Sung, Wonyong
    [J]. 2020 IEEE WORKSHOP ON SIGNAL PROCESSING SYSTEMS (SIPS), 2020, : 111 - 116
  • [5] Improving the Interpretability of Deep Neural Networks with Knowledge Distillation
    Liu, Xuan
    Wang, Xiaoguang
    Matwin, Stan
    [J]. 2018 18TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS (ICDMW), 2018, : 905 - 912
  • [6] Learning spiking neuronal networks with artificial neural networks: neural oscillations
    Zhang, Ruilin
    Wang, Zhongyi
    Wu, Tianyi
    Cai, Yuhang
    Tao, Louis
    Xiao, Zhuo-Cheng
    Li, Yao
    [J]. JOURNAL OF MATHEMATICAL BIOLOGY, 2024, 88 (06)
  • [7] Spiking neural networks for deep learning and knowledge representation: Editorial
    Kasabov, Nikola K.
    [J]. NEURAL NETWORKS, 2019, 119 : 341 - 342
  • [8] EGNN: Constructing explainable graph neural networks via knowledge distillation
    Li, Yuan
    Liu, Li
    Wang, Guoyin
    Du, Yong
    Chen, Penggang
    [J]. KNOWLEDGE-BASED SYSTEMS, 2022, 241
  • [9] Deep learning in spiking neural networks
    Tavanaei, Amirhossein
    Ghodrati, Masoud
    Kheradpisheh, Saeed Reza
    Masquelier, Timothee
    Maida, Anthony
    [J]. NEURAL NETWORKS, 2019, 111 : 47 - 63