Deep learning in spiking neural networks

Cited by: 632
Authors
Tavanaei, Amirhossein [1]
Ghodrati, Masoud [2 ]
Kheradpisheh, Saeed Reza [3 ]
Masquelier, Timothee [4 ]
Maida, Anthony [1 ]
Affiliations
[1] Univ Louisiana Lafayette, Sch Comp & Informat, Lafayette, LA 70504 USA
[2] Monash Univ, Dept Physiol, Clayton, Vic, Australia
[3] Kharazmi Univ, Fac Math Sci & Comp, Dept Comp Sci, Tehran, Iran
[4] Univ Toulouse 3, CNRS, CERCO, UMR 5549, F-31300 Toulouse, France
Keywords
Deep learning; Spiking neural network; Biological plausibility; Machine learning; Power-efficient architecture; TIMING-DEPENDENT PLASTICITY; BELIEF NETWORK; TEMPORAL PRECISION; RECEPTIVE FIELDS; ADAPTIVE NETWORK; GRADIENT DESCENT; VISUAL FEATURES; SPARSE CODE; NEURONS; MODEL;
DOI
10.1016/j.neunet.2018.12.002
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
In recent years, deep learning has revolutionized the field of machine learning, for computer vision in particular. In this approach, a deep (multilayer) artificial neural network (ANN) is trained, most often in a supervised manner using backpropagation. Vast amounts of labeled training examples are required, but the resulting classification accuracy is truly impressive, sometimes outperforming humans. Neurons in an ANN are characterized by a single, static, continuous-valued activation. Yet biological neurons use discrete spikes to compute and transmit information, and the spike times, in addition to the spike rates, matter. Spiking neural networks (SNNs) are thus more biologically realistic than ANNs, and are arguably the only viable option if one wants to understand how the brain computes at the neuronal description level. The spikes of biological neurons are sparse in time and space, and event-driven. Combined with bio-plausible local learning rules, this makes it easier to build low-power, neuromorphic hardware for SNNs. However, training deep SNNs remains a challenge. Spiking neurons' transfer function is usually non-differentiable, which prevents using backpropagation. Here we review recent supervised and unsupervised methods to train deep SNNs, and compare them in terms of accuracy and computational cost. The emerging picture is that SNNs still lag behind ANNs in terms of accuracy, but the gap is decreasing, and can even vanish on some tasks, while SNNs typically require many fewer operations and are the better candidates to process spatio-temporal data. (C) 2018 Elsevier Ltd. All rights reserved.
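As the abstract notes, spiking neurons compute with discrete, event-driven spikes rather than a single continuous activation, and the hard threshold makes their transfer function non-differentiable. A minimal sketch of the standard leaky integrate-and-fire (LIF) neuron model illustrates this (parameter values here are illustrative assumptions, not taken from the paper):

```python
def simulate_lif(input_current, tau=20.0, v_th=1.0, v_reset=0.0, dt=1.0):
    """Simulate a leaky integrate-and-fire neuron on a discrete time grid.

    Returns the indices of the time steps at which the neuron spiked.
    Parameters (tau, v_th, v_reset) are illustrative, not from the paper.
    """
    v = v_reset
    spikes = []
    for t, i_in in enumerate(input_current):
        # Leaky integration: the membrane potential decays toward rest
        # while being driven by the input current.
        v += (dt / tau) * (-v + i_in)
        if v >= v_th:        # threshold crossing emits a discrete spike
            spikes.append(t)  # ... which is the non-differentiable event
            v = v_reset       # hard reset after the spike
    return spikes

# A constant suprathreshold input produces a regular spike train.
spike_times = simulate_lif([1.5] * 100)
print(spike_times)
```

The binary threshold crossing is exactly the operation that blocks ordinary backpropagation; the surrogate-gradient methods cited in the related papers below replace its derivative with a smooth approximation during the backward pass.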
Pages: 47-63
Number of pages: 17
Related papers (50 total)
  • [1] Deep Residual Learning in Spiking Neural Networks
    Fang, Wei
    Yu, Zhaofei
    Chen, Yanqi
    Huang, Tiejun
    Masquelier, Timothee
    Tian, Yonghong
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [2] DSNNs: learning transfer from deep neural networks to spiking neural networks
    Zhang, Lei
    Du, Zidong
    Li, Ling
    Chen, Yunji
    [J]. High Technology Letters, 2020, 26 (02) : 136 - 144
  • [3] Advancing Spiking Neural Networks Toward Deep Residual Learning
    Hu, Yifan
    Deng, Lei
    Wu, Yujie
    Yao, Man
    Li, Guoqi
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, : 1 - 15
  • [4] GRADUAL SURROGATE GRADIENT LEARNING IN DEEP SPIKING NEURAL NETWORKS
    Chen, Yi
    Zhang, Silin
    Ren, Shiyu
    Qu, Hong
    [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 8927 - 8931
  • [5] Spiking neural networks for deep learning and knowledge representation: Editorial
    Kasabov, Nikola K.
    [J]. NEURAL NETWORKS, 2019, 119 : 341 - 342
  • [6] An Unsupervised Learning Algorithm for Deep Recurrent Spiking Neural Networks
    Du, Pangao
    Lin, Xianghong
    Pi, Xiaomei
    Wang, Xiangwen
    [J]. 2020 11TH IEEE ANNUAL UBIQUITOUS COMPUTING, ELECTRONICS & MOBILE COMMUNICATION CONFERENCE (UEMCON), 2020, : 603 - 607
  • [7] Temporal Dependent Local Learning for Deep Spiking Neural Networks
    Ma, Chenxiang
    Xu, Junhai
    Yu, Qiang
    [J]. 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021
  • [8] Integrating Spiking Neural Networks and Deep Learning Algorithms on the Neurorobotics Platform
    Stentiford, Rachael
    Knowles, Thomas C.
    Feldotto, Benedikt
    Ergene, Deniz
    Morin, Fabrice O.
    Pearson, Martin J.
    [J]. BIOMIMETIC AND BIOHYBRID SYSTEMS, LIVING MACHINES 2022, 2022, 13548 : 68 - 79
  • [9] An Efficient Learning Algorithm for Direct Training Deep Spiking Neural Networks
    Zhu, Xiaolei
    Zhao, Baixin
    Ma, De
    Tang, Huajin
    [J]. IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2022, 14 (03) : 847 - 856