Back-Propagation Learning in Deep Spike-By-Spike Networks

Cited by: 10
Authors
Rotermund, David [1 ]
Pawelzik, Klaus R. [1 ]
Affiliations
[1] Univ Bremen, Inst Theoret Phys, Bremen, Germany
Keywords
deep network (DN); spiking network model; sparseness; compressed sensing (CS); error back propagation (BP) neural network; FIRE NEURON MODEL; EXACT SIMULATION; SPARSITY;
DOI
10.3389/fncom.2019.00055
CLC classification
Q [Biological Sciences];
Subject classification codes
07; 0710; 09;
Abstract
Artificial neural networks (ANNs) are important building blocks in technical applications. They rely on noiseless continuous signals, in stark contrast to the discrete action potentials stochastically exchanged among neurons in real brains. We propose to bridge this gap with Spike-by-Spike (SbS) networks, which represent a compromise between non-spiking and spiking versions of generative models. What has been missing, however, are algorithms for finding weight sets that optimize the output performance of deep SbS networks with many layers. Here, a learning rule for feed-forward SbS networks is derived. The properties of this approach are investigated and its functionality is demonstrated by simulations. In particular, a Deep Convolutional SbS network for classifying handwritten digits achieves a classification performance of roughly 99.3% on the MNIST test data when the learning rule is applied together with an optimizer, thereby approaching the benchmark results of ANNs without extensive parameter optimization. We envision this learning rule for SbS networks providing a new basis for research in neuroscience and for technical applications, especially when implemented on specialized computational hardware.
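The spike-by-spike inference that underlies this model family can be illustrated with a minimal sketch. The following is a hedged reconstruction of the SbS latent-variable update from the generative-model literature the paper builds on, not code from this paper itself: the network sizes, the value of the update rate `eps`, and the multiplicative update form `h ← (h + ε · h·W[s] / Σ_j h_j W[s,j]) / (1 + ε)` are assumptions for illustration. Each observed input spike on channel `s` nudges the latent distribution `h` toward the hidden causes that best explain it, while the normalization keeps `h` on the probability simplex.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden = 16, 8
eps = 0.1  # per-spike update rate (hypothetical value)

# W[s, i] = p(input spike on channel s | hidden cause i);
# each column is a probability distribution over input channels.
W = rng.random((n_in, n_hidden))
W /= W.sum(axis=0, keepdims=True)

# Input firing probabilities over the n_in channels.
p_in = rng.random(n_in)
p_in /= p_in.sum()

# Latent variables start uniform and stay on the probability simplex.
h = np.full(n_hidden, 1.0 / n_hidden)

for _ in range(1000):
    s = rng.choice(n_in, p=p_in)               # draw one stochastic input spike
    r = W[s] * h                               # responsibility of each latent cause for spike s
    h = (h + eps * r / r.sum()) / (1.0 + eps)  # multiplicative spike-by-spike update
```

Because the update divides by `1 + eps`, `h` remains a normalized distribution after every spike, which is what makes a layer's output directly usable as the spike-generating distribution for the next layer in a deep SbS network.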
Pages: 13
Related Papers (50 total)
  • [1] Accelerator Framework of Spike-By-Spike Neural Networks for Inference and Incremental Learning in Embedded Systems
    Nevarez, Yarib
    Garcia-Ortiz, Alberto
    Rotermund, David
    Pawelzik, Klaus R.
    2020 9TH INTERNATIONAL CONFERENCE ON MODERN CIRCUITS AND SYSTEMS TECHNOLOGIES (MOCAST), 2020,
  • [2] Dendritic back-propagation: physiological spike trains do better
[Author unknown]
    NEUROSCIENTIST, 2001, 7 (03): : 185 - 186
  • [3] Back-propagation learning in expert networks
    Lacher, R. C.
    Hruska, S. I.
    Kuncicky, D. C.
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1992, 3 (01): : 62 - 72
  • [4] Dendritic excitations govern back-propagation via a spike-rate accelerometer
    Park, Pojeong
    Wong-Campos, J. David
    Itkis, Daniel G.
    Lee, Byung Hun
    Qi, Yitong
    Davis, Hunter C.
    Antin, Benjamin
    Pasarkar, Amol
    Grimm, Jonathan B.
    Plutkis, Sarah E.
    Holland, Katie L.
    Paninski, Liam
    Lavis, Luke D.
    Cohen, Adam E.
    NATURE COMMUNICATIONS, 2025, 16 (01)
  • [5] Correlation of multiple neuronal spike trains using the back-propagation error correction algorithm
    Tam, D.C.
    Perkel, D.H.
    Tucker, W.S.
    Neural Networks, 1988, 1 (1 SUPPL)
  • [6] The HSIC Bottleneck: Deep Learning without Back-Propagation
    Ma, Wan-Duo Kurt
    Lewis, J. P.
    Kleijn, W. Bastiaan
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 5085 - 5092
  • [7] Temporal correction of multiple neuronal spike trains using the back-propagation error correction algorithm
    Tam, D.C.
    Perkel, D.H.
    Tucker, W.S.
    Neural Networks, 1988, 1 (1 SUPPL)
  • [8] Layer multiplexing FPGA implementation for deep back-propagation learning
    Ortega-Zamorano, Francisco
    Jerez, Jose M.
    Gomez, Ivan
    Franco, Leonardo
    INTEGRATED COMPUTER-AIDED ENGINEERING, 2017, 24 (02) : 171 - 185
  • [9] Adaptive back-propagation in on-line learning of multilayer networks
    West, AHL
    Saad, D
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 8: PROCEEDINGS OF THE 1995 CONFERENCE, 1996, 8 : 323 - 329
  • [10] Axonal Na+ Channels Ensure Fast Spike Activation and Back-Propagation in Cerebellar Granule Cells
    Diwakar, Shyam
    Magistretti, Jacopo
    Goldfarb, Mitchell
    Naldi, Giovanni
    D'Angelo, Egidio
    JOURNAL OF NEUROPHYSIOLOGY, 2009, 101 (02) : 519 - 532