Back-Propagation Learning in Deep Spike-By-Spike Networks

Cited: 10
Authors
Rotermund, David [1 ]
Pawelzik, Klaus R. [1 ]
Affiliations
[1] Univ Bremen, Inst Theoret Phys, Bremen, Germany
Keywords
deep network (DN); spiking network model; sparseness; compressed sensing (CS); error back propagation (BP) neural network; FIRE NEURON MODEL; EXACT SIMULATION; SPARSITY;
DOI
10.3389/fncom.2019.00055
CLC Classification
Q [Biological Sciences];
Subject Classification Codes
07 ; 0710 ; 09 ;
Abstract
Artificial neural networks (ANNs) are important building blocks in technical applications. They rely on noiseless continuous signals, in stark contrast to the discrete action potentials stochastically exchanged among the neurons in real brains. We propose to bridge this gap with Spike-by-Spike (SbS) networks, which represent a compromise between non-spiking and spiking versions of generative models. What has been missing, however, are algorithms for finding weight sets that optimize the output performance of deep SbS networks with many layers. Here, a learning rule for feed-forward SbS networks is derived. The properties of this approach are investigated and its functionality is demonstrated by simulations. In particular, a Deep Convolutional SbS network for classifying handwritten digits achieves a classification performance of roughly 99.3% on the MNIST test data when the learning rule is applied together with an optimizer, thereby approaching the benchmark results of ANNs without extensive parameter optimization. We envision this learning rule for SbS networks to provide a new basis for research in neuroscience and for technical applications, especially when implemented on specialized computational hardware.
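The abstract itself gives no equations, but the core spike-driven inference dynamics of the Spike-by-Spike generative model (from the authors' earlier work on SbS networks) can be sketched as follows. This is a minimal illustrative sketch only: the function name `sbs_update`, the step size `eps`, and the exact normalization are assumptions, not taken from this paper.

```python
import numpy as np

def sbs_update(h, W, s, eps=0.1):
    """One Spike-by-Spike inference step.

    A single spike arrives on input channel s; the latent activity h
    (non-negative, summing to 1) is nudged toward the posterior implied
    by the weight column for that channel.

    h : (n_hidden,) latent activity vector, a probability distribution
    W : (n_inputs, n_hidden) non-negative weights; W[s] is the row for channel s
    """
    p = h * W[s]               # unnormalized posterior given the spike
    p_sum = p.sum()
    if p_sum <= 0:             # channel carries no evidence; leave h unchanged
        return h
    h_new = h + eps * (p / p_sum - h)   # small multiplicative-style update
    return h_new / h_new.sum()          # renormalize to a distribution

# Toy usage: random non-negative weights, uniform initial latent state,
# then a short stream of spikes on arbitrary input channels.
rng = np.random.default_rng(0)
W = rng.random((5, 3))
h = np.ones(3) / 3
for s in [0, 2, 0, 1, 0]:
    h = sbs_update(h, W, s)
```

Because each update is a convex-combination step followed by renormalization, `h` remains a valid probability distribution after every spike, which is what allows inference to proceed one spike at a time.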
Pages: 13
Related Papers
50 records
  • [41] BENEFITS OF GAIN - SPEEDED LEARNING AND MINIMAL HIDDEN LAYERS IN BACK-PROPAGATION NETWORKS
    KRUSCHKE, JK
    MOVELLAN, JR
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS, 1991, 21 (01): : 273 - 280
  • [42] Awesome back-propagation machine learning paradigm
    Badr, Assem
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (20): : 13225 - 13249
  • [43] New parallel algorithms for back-propagation learning
    Alves, RLD
    de Melo, JD
    Neto, ADD
    Albuquerque, ACML
    PROCEEDINGS OF THE 2002 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-3, 2002, : 2686 - 2691
  • [44] AN ACCELERATED ERROR BACK-PROPAGATION LEARNING ALGORITHM
    MAKRAMEBEID, S
    SIRAT, JA
    VIALA, JR
    PHILIPS JOURNAL OF RESEARCH, 1990, 44 (06) : 521 - 540
  • [45] AN ANALYSIS OF PREMATURE SATURATION IN BACK-PROPAGATION LEARNING
    LEE, Y
    OH, SH
    KIM, MW
    NEURAL NETWORKS, 1993, 6 (05) : 719 - 728
  • [47] Dendritic spike back propagation in the electrosensory lobe of Gnathonemus petersii
    Gómez, L
    Kanneworff, M
    Budelli, R
    Grant, K
    JOURNAL OF EXPERIMENTAL BIOLOGY, 2005, 208 (01): : 141 - 155
  • [48] Temporal Spike Sequence Learning via Backpropagation for Deep Spiking Neural Networks
    Zhang, Wenrui
    Li, Peng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [49] Distance- and activity-dependent modulation of spike back-propagation in layer V pyramidal neurons of the medial entorhinal cortex
    Gasparini, Sonia
    JOURNAL OF NEUROPHYSIOLOGY, 2011, 105 (03) : 1372 - 1379
  • [50] Learning to represent signals spike by spike
    Brendel, Wieland
    Bourdoukan, Ralph
    Vertechi, Pietro
    Machens, Christian K.
    Deneve, Sophie
    PLOS COMPUTATIONAL BIOLOGY, 2020, 16 (03)