Advancing Spiking Neural Networks Toward Deep Residual Learning

Times Cited: 0
Authors
Hu, Yifan [1 ,2 ]
Deng, Lei [1 ]
Wu, Yujie [3 ]
Yao, Man [2 ,4 ]
Li, Guoqi [2 ,4 ]
Affiliations
[1] Tsinghua Univ, Ctr Brain Inspired Comp Res, Dept Precis Instrument, Beijing 100084, Peoples R China
[2] Peng Cheng Lab, Shenzhen 518066, Peoples R China
[3] Graz Univ Technol, Inst Theoret Comp Sci, A-8010 Graz, Austria
[4] Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China
Funding
National Science Foundation, USA;
Keywords
Degradation problem; neuromorphic computing; residual neural network; spiking neural network (SNN); INTELLIGENCE; MODEL;
DOI
10.1109/TNNLS.2024.3355393
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Despite the rapid progress of neuromorphic computing, the inadequate capacity and insufficient representational power of spiking neural networks (SNNs) severely restrict their application scope in practice. Residual learning and shortcuts have proven to be an important approach for training deep neural networks, but previous work has rarely assessed their applicability to the specifics of SNNs. In this article, we first identify that this neglect leads to impeded information flow and an accompanying degradation problem in a spiking version of vanilla ResNet. To address this issue, we propose a novel SNN-oriented residual architecture, termed MS-ResNet, which establishes membrane-based shortcut pathways, and we further prove that gradient norm equality can be achieved in MS-ResNet by introducing block dynamical isometry theory, which ensures that the network behaves well in a depth-insensitive way. We are thus able to significantly extend the depth of directly trained SNNs, e.g., up to 482 layers on CIFAR-10 and 104 layers on ImageNet, without observing any degradation problem. To validate the effectiveness of MS-ResNet, we conduct experiments on both frame-based and neuromorphic datasets. MS-ResNet104 achieves 76.02% accuracy on ImageNet, which, to the best of our knowledge, is the highest in the domain of directly trained SNNs. High energy efficiency is also observed, with an average of only one spike per neuron needed to classify an input sample. We believe our powerful and scalable models will provide strong support for the further exploration of SNNs.
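The gradient norm equality claimed above can be stated concretely via the block dynamical isometry criterion the abstract invokes. The following is a brief restatement under the usual conventions of that framework (the network decomposed into L serial blocks with input-output Jacobians J_j), not a quotation of the paper's theorem:

```latex
\[
J \;=\; \prod_{j=L}^{1} J_j, \qquad
\phi(M) \;=\; \frac{\mathbb{E}\!\left[\operatorname{tr}(M)\right]}{\dim(M)} .
\]
\[
\text{Block dynamical isometry:}\quad
\phi\!\left(J_j J_j^{\top}\right) \approx 1
\;\;\text{and}\;\;
\phi\!\left(\bigl(J_j J_j^{\top}\bigr)^{2}\right) - \phi\!\left(J_j J_j^{\top}\right)^{2} \approx 0
\quad \text{for all } j,
\]
```

so that \(\phi(J J^{\top}) \approx 1\) holds regardless of the depth L and backpropagated gradient norms neither vanish nor explode.

The membrane-based shortcut itself is straightforward to sketch. The PyTorch fragment below is a minimal illustrative reconstruction, not the authors' released code: it assumes a single-timestep LIF-style neuron with a rectangular surrogate gradient, and it places the spiking activation before each convolution so that the identity shortcut adds real-valued membrane-level signals rather than binary spikes. All identifiers (`SpikeFn`, `LIFNeuron`, `MSBlock`, `v_th`) are hypothetical names chosen here.

```python
import torch
import torch.nn as nn

class SpikeFn(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient."""
    @staticmethod
    def forward(ctx, v, v_th):
        ctx.save_for_backward(v)
        ctx.v_th = v_th
        return (v >= v_th).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Pass gradients only in a width-1 window around the threshold.
        surrogate = (torch.abs(v - ctx.v_th) < 0.5).float()
        return grad_out * surrogate, None

class LIFNeuron(nn.Module):
    """Single-step LIF-style neuron: the input is treated as the
    membrane potential accumulated at the current timestep."""
    def __init__(self, v_th: float = 1.0):
        super().__init__()
        self.v_th = v_th

    def forward(self, v):
        return SpikeFn.apply(v, self.v_th)

class MSBlock(nn.Module):
    """Residual block with a membrane shortcut: the spiking activation
    precedes each conv, so the identity path carries and adds
    real-valued (membrane-level) signals, not binary spikes."""
    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            LIFNeuron(),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            LIFNeuron(),
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )

    def forward(self, x):
        return x + self.body(x)  # addition happens at membrane level

# Quick shape check.
if __name__ == "__main__":
    block = MSBlock(16)
    y = block(torch.randn(2, 16, 8, 8))
    print(y.shape)  # torch.Size([2, 16, 8, 8])
```

The contrast with a spiking version of vanilla ResNet is the position of the final spiking neuron: placing it after the addition forces the identity path to pass through a neuron in every block, which is the impeded information flow the article identifies as the source of the degradation problem.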
Pages: 1-15 (15 pages)