DSNNs: learning transfer from deep neural networks to spiking neural networks

Cited by: 0
Authors
Zhang L. [1 ,2 ,3 ]
Du Z. [1 ,2 ,3 ]
Li L. [4 ]
Chen Y. [1 ,2 ]
Affiliations
[1] State Key Laboratory of Computer Architecture, Institute of Computing Technology, Chinese Academy of Sciences, Beijing
[2] Institute of Computing Technology, Chinese Academy of Sciences, Beijing
[3] Cambricon Tech. Ltd, Beijing
[4] Institute of Software, Chinese Academy of Sciences, Beijing
Source
High Technology Letters | 2020, Vol. 26, No. 02
Funding
National Natural Science Foundation of China
Keywords
Convert method; Deep learning; Spatially folded network; Spiking neural network (SNN)
DOI
10.3772/j.issn.1006-6748.2020.02.002
Abstract
Deep neural networks (DNNs) have drawn great attention as they achieve state-of-the-art results on many tasks. Compared to DNNs, spiking neural networks (SNNs), which are considered the next generation of neural networks, fail to achieve comparable performance, especially on tasks with large problem sizes. Much previous work has tried to close the gap between DNNs and SNNs, but used small networks on simple tasks. This work proposes a simple but effective way to construct deep spiking neural networks (DSNNs) by transferring the learned ability of DNNs to SNNs. DSNNs achieve comparable accuracy on large networks and complex datasets. Copyright © by HIGH TECHNOLOGY LETTERS PRESS.
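The abstract only sketches the transfer idea at a high level. A minimal illustration of the general DNN-to-SNN conversion principle it builds on (not the paper's specific DSNN procedure, which is not detailed here) is to reuse trained DNN weights and let integrate-and-fire neuron firing rates approximate ReLU activations; the weights and inputs below are hypothetical:

```python
import numpy as np

# Hypothetical pre-trained weights of one dense ReLU layer (4 inputs -> 3 units).
W = np.array([[0.5, -0.4, 0.3],
              [0.2, 0.6, -0.1],
              [-0.3, 0.1, 0.4],
              [0.4, -0.2, 0.2]])
x = np.array([0.2, 0.8, 0.5, 0.1])  # analog input in [0, 1]

# DNN forward pass: ReLU activation.
dnn_out = np.maximum(0.0, x @ W)

# SNN counterpart: integrate-and-fire neurons driven by the same weights.
# The constant input current x @ W is integrated each timestep; a neuron
# fires when its membrane potential crosses the threshold, and the potential
# is then reduced by the threshold ("reset by subtraction"), which keeps
# firing rates proportional to the input current.
T, threshold = 1000, 1.0
v = np.zeros(3)               # membrane potentials
spike_count = np.zeros(3)
current = x @ W
for _ in range(T):
    v += current
    fired = v >= threshold
    spike_count += fired
    v[fired] -= threshold

snn_rate = spike_count / T    # firing rate approximates the ReLU output
print(dnn_out)
print(snn_rate)
```

With enough timesteps the rate converges to the ReLU activation (error bounded by threshold/T), which is why a converted SNN can inherit the DNN's accuracy without retraining.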
Pages: 136 - 144 (8 pages)
Related papers (50 total)
  • [1] DSNNs: learning transfer from deep neural networks to spiking neural networks
    Zhang Lei
    Du Zidong
    Li Ling
    Chen Yunji
    High Technology Letters, 2020, 26 (02) : 136 - 144
  • [2] Deep learning in spiking neural networks
    Tavanaei, Amirhossein
    Ghodrati, Masoud
    Kheradpisheh, Saeed Reza
    Masquelier, Timothee
    Maida, Anthony
    NEURAL NETWORKS, 2019, 111 : 47 - 63
  • [3] Deep Residual Learning in Spiking Neural Networks
    Fang, Wei
    Yu, Zhaofei
    Chen, Yanqi
    Huang, Tiejun
    Masquelier, Timothee
    Tian, Yonghong
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [4] Effective Transfer Learning Algorithm in Spiking Neural Networks
    Zhan, Qiugang
    Liu, Guisong
    Xie, Xiurui
    Sun, Guolin
    Tang, Huajin
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (12) : 13323 - 13335
  • [5] Near Lossless Transfer Learning for Spiking Neural Networks
    Yan, Zhanglu
    Zhou, Jun
    Wong, Weng-Fai
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 10577 - 10584
  • [6] Training Spiking Neural Networks Using Lessons From Deep Learning
    Eshraghian, Jason K.
    Ward, Max
    Neftci, Emre O.
    Wang, Xinxin
    Lenz, Gregor
    Dwivedi, Girish
    Bennamoun, Mohammed
    Jeong, Doo Seok
    Lu, Wei D.
    PROCEEDINGS OF THE IEEE, 2023, 111 (09) : 1016 - 1054
  • [7] Constructing Deep Spiking Neural Networks from Artificial Neural Networks with Knowledge Distillation
    Xu, Qi
    Li, Yaxin
    Shen, Jiangrong
    Liu, Jian K.
    Tang, Huajin
    Pan, Gang
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR, 2023, : 7886 - 7895
  • [8] Advancing Spiking Neural Networks Toward Deep Residual Learning
    Hu, Yifan
    Deng, Lei
    Wu, Yujie
    Yao, Man
    Li, Guoqi
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, : 1 - 15
  • [9] GRADUAL SURROGATE GRADIENT LEARNING IN DEEP SPIKING NEURAL NETWORKS
    Chen, Yi
    Zhang, Silin
    Ren, Shiyu
    Qu, Hong
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 8927 - 8931
  • [10] Spiking neural networks for deep learning and knowledge representation: Editorial
    Kasabov, Nikola K.
    NEURAL NETWORKS, 2019, 119 : 341 - 342