Back-Propagation Learning in Deep Spike-By-Spike Networks

Cited by: 10
Authors
Rotermund, David [1 ]
Pawelzik, Klaus R. [1 ]
Affiliations
[1] Univ Bremen, Inst Theoret Phys, Bremen, Germany
Keywords
deep network (DN); spiking network model; sparseness; compressed sensing (CS); error back propagation (BP) neural network; fire neuron model; exact simulation; sparsity
DOI
10.3389/fncom.2019.00055
Chinese Library Classification
Q [Biological Sciences]
Discipline Classification Codes
07; 0710; 09
Abstract
Artificial neural networks (ANNs) are important building blocks in technical applications. They rely on noiseless continuous signals, in stark contrast to the discrete action potentials stochastically exchanged among the neurons in real brains. We propose to bridge this gap with Spike-by-Spike (SbS) networks, which represent a compromise between non-spiking and spiking versions of generative models. What is missing, however, are algorithms for finding weight sets that would optimize the output performance of deep SbS networks with many layers. Here, a learning rule for feed-forward SbS networks is derived. The properties of this approach are investigated and its functionality is demonstrated by simulations. In particular, a Deep Convolutional SbS network for classifying handwritten digits achieves a classification performance of roughly 99.3% on the MNIST test data when the learning rule is applied together with an optimizer. It thereby approaches the benchmark results of ANNs without extensive parameter optimization. We envision this learning rule for SbS networks to provide a new basis for research in neuroscience and for technical applications, especially when implemented on specialized computational hardware.
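For readers unfamiliar with the SbS dynamics the abstract refers to, the sketch below illustrates the kind of multiplicative, spike-driven update of the latent variables h(i) described in earlier Spike-by-Spike publications; it does not reproduce the back-propagation learning rule derived in this article. The weight matrix W, the update-strength parameter epsilon, and all concrete values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sbs_inference_step(h, W, s, epsilon=0.1):
    """One Spike-by-Spike latent update for a single observed input spike.

    h       -- latent activities, shape (n_hidden,), non-negative, summing to 1
    W       -- generative weights, shape (n_inputs, n_hidden); column i approximates p(s|i)
    s       -- index of the input channel that emitted the current spike
    epsilon -- update-strength parameter (illustrative value, not from the paper)
    """
    responsibility = W[s, :] * h                # h_i * p(s|i) for the observed spike
    responsibility /= responsibility.sum()      # normalize over hidden units
    return (h + epsilon * responsibility) / (1.0 + epsilon)

# Toy usage: 4 input channels, 3 hidden units, a short made-up spike train.
rng = np.random.default_rng(0)
W = rng.random((4, 3))
W /= W.sum(axis=0, keepdims=True)               # each column becomes a distribution over inputs
h = np.full(3, 1.0 / 3)                         # start from a uniform latent state
for s in [0, 2, 2, 1]:                          # indices of observed input spikes
    h = sbs_inference_step(h, W, s)
print(h, h.sum())                               # h remains normalized after every spike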
Pages: 13
Related Papers
50 entries in total
  • [31] On the performance of back-propagation networks in econometric analysis
    Guillen, Montserrat
    Soldevilla, Carlos
    Informatica (Ljubljana), 1996, 20 (04): 435 - 441
  • [32] BACK-PROPAGATION
    JONES, WP
    HOSKINS, J
    BYTE, 1987, 12 (11): 155
  • [33] Modelling symmetry detection with back-propagation networks
    Latimer, C.
    Joung, W.
    Stevens, C.
    Spatial Vision, 1994, 8 (04)
  • [34] GENERALIZATION OF BACK-PROPAGATION TO RECURRENT NEURAL NETWORKS
    PINEDA, FJ
    PHYSICAL REVIEW LETTERS, 1987, 59 (19): 2229 - 2232
  • [35] Learning Physics-Informed Neural Networks without Stacked Back-propagation
    He, Di
    Li, Shanda
    Shi, Wenlei
    Gao, Xiaotian
    Zhang, Jia
    Bian, Jiang
    Wang, Liwei
    Liu, Tie-Yan
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 206, 2023, 206
  • [37] RRAM-Based Binary Neural Networks Using Back-Propagation Learning
    Jiang, Yuning
    Yang, Yunfan
    Zhou, Zheng
    Xiang, Yachen
    Huang, Peng
    Kang, Jinfeng
    2018 14TH IEEE INTERNATIONAL CONFERENCE ON SOLID-STATE AND INTEGRATED CIRCUIT TECHNOLOGY (ICSICT), 2018, : 1252 - 1254
  • [38] LEARNING BY BACK-PROPAGATION - COMPUTING IN A SYSTOLIC WAY
    MILLAN, JD
    BOFILL, P
    LECTURE NOTES IN COMPUTER SCIENCE, 1989, 366 : 235 - 252
  • [39] Error measures of the back-propagation learning algorithm
    Fujiki, S
    Nakao, M
    Fujiki, NM
    JOURNAL OF THE KOREAN PHYSICAL SOCIETY, 2002, 40 (06) : 1091 - 1095
  • [40] On-line learning with adaptive back-propagation in two-layer networks
    West, AHL
    Saad, D
    PHYSICAL REVIEW E, 1997, 56 (03): 3426 - 3445