Going Deeper With Directly-Trained Larger Spiking Neural Networks

Cited by: 0
Authors
Zheng, Hanle [1]
Wu, Yujie [1]
Deng, Lei [1,3]
Hu, Yifan [1]
Li, Guoqi [1,2]
Affiliations
[1] Tsinghua Univ, Ctr Brain Inspired Comp Res, Dept Precis Instrument, Beijing 100084, Peoples R China
[2] Tsinghua Univ, Beijing Innovat Ctr Future Chip, Beijing 100084, Peoples R China
[3] Univ Calif Santa Barbara, Dept Elect & Comp Engn, Santa Barbara, CA 93106 USA
Funding
National Key R&D Program of China;
Keywords
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Spiking neural networks (SNNs) are promising for bio-plausible coding of spatio-temporal information and event-driven signal processing, which is well suited to energy-efficient implementation on neuromorphic hardware. However, the unique working mode of SNNs makes them more difficult to train than traditional networks. Currently, there are two main routes to training deep SNNs with high performance. The first is to convert a pre-trained ANN model to its SNN version, which usually requires a long coding window for convergence and cannot exploit spatio-temporal features during training to solve temporal tasks. The other is to directly train SNNs in the spatio-temporal domain; but owing to the binary spike activity of the firing function and the problem of vanishing or exploding gradients, current methods are restricted to shallow architectures and therefore struggle to harness large-scale datasets (e.g., ImageNet). To this end, we propose a threshold-dependent batch normalization (tdBN) method based on the emerging spatio-temporal backpropagation, termed "STBP-tdBN", enabling direct training of very deep SNNs and efficient implementation of their inference on neuromorphic hardware. With the proposed method and an elaborated shortcut connection, we significantly extend directly-trained SNNs from shallow structures (fewer than 10 layers) to a very deep structure (50 layers). Furthermore, we theoretically analyze the effectiveness of our method based on "Block Dynamical Isometry" theory. Finally, we report superior accuracy results, including 93.15% on CIFAR-10, 67.8% on DVS-CIFAR10, and 67.05% on ImageNet with very few timesteps. To the best of our knowledge, this is the first work to achieve high performance on ImageNet with directly-trained deep SNNs. We believe this work will pave the way toward fully exploiting the advantages of SNNs and attract more researchers to the field.
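The abstract's central technique, tdBN, normalizes each channel's pre-activations jointly over the batch and time dimensions, then rescales them by the firing threshold V_th so that the input to each spiking neuron has variance (alpha * V_th)^2. Below is a minimal PyTorch-style sketch of that description; the class name, the (T, B, C, H, W) tensor layout, the default alpha = 1, and the training-mode-only statistics are assumptions for illustration, not the authors' reference implementation.

```python
import torch
import torch.nn as nn

class ThresholdDependentBN(nn.Module):
    """Sketch of threshold-dependent batch norm (tdBN).

    Assumption: input x has shape (T, B, C, H, W), i.e. time first.
    Statistics are computed over time, batch, and spatial dims for
    each channel; running statistics for inference are omitted here.
    """

    def __init__(self, channels, v_th=1.0, alpha=1.0, eps=1e-5):
        super().__init__()
        self.v_th, self.alpha, self.eps = v_th, alpha, eps
        # learnable per-channel affine parameters, as in standard BN
        self.gamma = nn.Parameter(torch.ones(channels))
        self.beta = nn.Parameter(torch.zeros(channels))

    def forward(self, x):
        dims = (0, 1, 3, 4)  # all dims except the channel dim
        mean = x.mean(dim=dims, keepdim=True)
        var = x.var(dim=dims, unbiased=False, keepdim=True)
        # normalize, then scale by alpha * V_th so the pre-activation
        # variance matches (alpha * V_th)^2
        x_hat = self.alpha * self.v_th * (x - mean) / torch.sqrt(var + self.eps)
        g = self.gamma.view(1, 1, -1, 1, 1)
        b = self.beta.view(1, 1, -1, 1, 1)
        return g * x_hat + b

# usage: pre-activations over 4 timesteps, batch of 8, 16 channels
x = torch.randn(4, 8, 16, 32, 32)
y = ThresholdDependentBN(16)(x)
```

Normalizing over the time dimension jointly with the batch is what distinguishes tdBN from applying standard BN at each timestep independently, and scaling by V_th ties the normalized variance to the neuron's firing threshold, the property the paper's "Block Dynamical Isometry" analysis builds on.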
Pages: 11062-11070
Number of pages: 9
Related Papers
50 in total (10 shown)
  • [1] Deep Directly-Trained Spiking Neural Networks for Object Detection
    Su, Qiaoyi; Chou, Yuhong; Hu, Yifan; Li, Jianing; Mei, Shijie; Zhang, Ziyang; Li, Guoqi
    2023 IEEE/CVF International Conference on Computer Vision (ICCV), 2023: 6532-6542
  • [2] Going Deeper in Spiking Neural Networks: VGG and Residual Architectures
    Sengupta, Abhronil; Ye, Yuting; Wang, Robert; Liu, Chiao; Roy, Kaushik
    Frontiers in Neuroscience, 2019, 13
  • [3] Directly-trained Spiking Neural Networks for Deep Reinforcement Learning: Energy Efficient Implementation of Event-Based Obstacle Avoidance on a Neuromorphic Accelerator
    Zanatta, Luca; Di Mauro, Alfio; Barchi, Francesco; Bartolini, Andrea; Benini, Luca; Acquaviva, Andrea
    Neurocomputing, 2023, 562
  • [4] Spiking Neural Networks Trained via Proxy
    Kheradpisheh, Saeed Reza; Mirsadeghi, Maryam; Masquelier, Timothee
    IEEE Access, 2022, 10: 70769-70778
  • [5] Going Deeper: Autonomous Steering with Neural Memory Networks
    Fernando, Tharindu; Denman, Simon; Sridharan, Sridha; Fookes, Clinton
    2017 IEEE International Conference on Computer Vision Workshops (ICCVW 2017), 2017: 214-221
  • [6] Going Deeper with Brain Morphometry Using Neural Networks
    Santa Cruz, Rodrigo; Lebrat, Leo; Bourgeat, Pierrick; Dore, Vincent; Dowling, Jason; Fripp, Jurgen; Fookes, Clinton; Salvado, Olivier
    2021 IEEE 18th International Symposium on Biomedical Imaging (ISBI), 2021: 711-715
  • [7] Going Deeper with Neural Networks Without Skip Connections
    Oyedotun, Oyebade K.; Shabayek, Abd El Rahman; Aouada, Djamila; Ottersten, Bjoern
    2020 IEEE International Conference on Image Processing (ICIP), 2020: 1756-1760
  • [8] Optimizing Deeper Spiking Neural Networks for Dynamic Vision Sensing
    Kim, Youngeun; Panda, Priyadarshini
    Neural Networks, 2021, 144: 686-698
  • [9] Memristive Spiking Neural Networks Trained with Unsupervised STDP
    Zhou, Errui; Fang, Liang; Yang, Binbin
    Electronics, 2018, 7 (12)
  • [10] Direct Training for Spiking Neural Networks: Faster, Larger, Better
    Wu, Yujie; Deng, Lei; Li, Guoqi; Zhu, Jun; Xie, Yuan; Shi, Luping
    Thirty-Third AAAI Conference on Artificial Intelligence (AAAI-19), 2019: 1311-1318