Going Deeper With Directly-Trained Larger Spiking Neural Networks

Citations: 0
Authors
Zheng, Hanle [1 ]
Wu, Yujie [1 ]
Deng, Lei [1 ,3 ]
Hu, Yifan [1 ]
Li, Guoqi [1 ,2 ]
Affiliations
[1] Tsinghua Univ, Ctr Brain Inspired Comp Res, Dept Precis Instrument, Beijing 100084, Peoples R China
[2] Tsinghua Univ, Beijing Innovat Ctr Future Chip, Beijing 100084, Peoples R China
[3] Univ Calif Santa Barbara, Dept Elect & Comp Engn, Santa Barbara, CA 93106 USA
Funding
National Key R&D Program of China
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Spiking neural networks (SNNs) are promising for bio-plausible coding of spatio-temporal information and event-driven signal processing, which is well suited to energy-efficient implementation in neuromorphic hardware. However, the unique working mode of SNNs makes them harder to train than traditional networks. Currently, there are two main routes for training deep SNNs with high performance. The first is to convert a pre-trained ANN model to its SNN version, which usually requires a long coding window for convergence and cannot exploit spatio-temporal features during training to solve temporal tasks. The other is to directly train SNNs in the spatio-temporal domain; but due to the binary spike activity of the firing function and the problem of vanishing or exploding gradients, current methods are restricted to shallow architectures and therefore struggle to harness large-scale datasets (e.g., ImageNet). To this end, we propose a threshold-dependent batch normalization (tdBN) method based on the emerging spatio-temporal backpropagation, termed "STBP-tdBN", enabling direct training of very deep SNNs and efficient implementation of their inference on neuromorphic hardware. With the proposed method and elaborated shortcut connections, we significantly extend directly-trained SNNs from shallow structures (<10 layers) to a very deep structure (50 layers). Furthermore, we theoretically analyze the effectiveness of our method based on "Block Dynamical Isometry" theory. Finally, we report superior accuracy results, including 93.15% on CIFAR-10, 67.8% on DVS-CIFAR10, and 67.05% on ImageNet, with very few timesteps. To the best of our knowledge, this is the first work to explore directly-trained deep SNNs with high performance on ImageNet. We believe this work will pave the way to fully exploiting the advantages of SNNs and attract more researchers to this field.
Pages: 11062-11070
Page count: 9