Exploiting heterogeneity in operational neural networks by synaptic plasticity

Cited by: 14
Authors
Kiranyaz, Serkan [1 ]
Malik, Junaid [1 ,4 ]
Abdallah, Habib Ben [1 ]
Ince, Turker [2 ]
Iosifidis, Alexandros [3 ]
Gabbouj, Moncef [4 ]
Affiliations
[1] Qatar Univ, Coll Engn, Elect Engn, Doha, Qatar
[2] Izmir Univ Econ, Elect & Elect Engn Dept, Izmir, Turkey
[3] Aarhus Univ, Dept Engn, Aarhus, Denmark
[4] Tampere Univ, Dept Signal Proc, Tampere, Finland
Source
NEURAL COMPUTING & APPLICATIONS | 2021, Vol. 33, No. 13
Keywords
Operational neural networks; Convolutional neural networks; Synaptic plasticity; Neuronal diversity; Representations
DOI
10.1007/s00521-020-05543-w
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The recently proposed network model, Operational Neural Networks (ONNs), generalizes conventional Convolutional Neural Networks (CNNs), which are homogeneous networks built on a single linear neuron model. As a heterogeneous network model, ONNs are based on a generalized neuron model that can encapsulate any set of non-linear operators to boost diversity and to learn highly complex and multi-modal functions or spaces with minimal network complexity and training data. However, the default search method for finding optimal operators in ONNs, the so-called Greedy Iterative Search (GIS) method, usually takes several training sessions to find a single operator set per layer. This is not only computationally demanding; it also limits network heterogeneity, since the same operator set is then used for all neurons in each layer. To address this deficiency and exploit a superior level of heterogeneity, this study focuses on searching for the best-possible operator set(s) for the hidden neurons of the network based on the "Synaptic Plasticity" paradigm, which constitutes the essential learning theory in biological neurons. During training, each operator set in the library can be evaluated by its synaptic plasticity level and ranked from worst to best, and an "elite" ONN can then be configured using the top-ranked operator sets found at each hidden layer. Experimental results over highly challenging problems demonstrate that elite ONNs, even with few neurons and layers, achieve superior learning performance compared with GIS-based ONNs, and as a result the performance gap over CNNs widens further.
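To illustrate the generalized neuron model the abstract describes, the following is a minimal, hypothetical sketch (not the authors' code): an operational neuron replaces the fixed multiply-and-sum of a convolutional neuron with a nodal operator and a pool operator drawn from an operator library. The operator names and the 1-D formulation here are illustrative assumptions.

```python
import numpy as np

# Illustrative operator library (assumed for this sketch).
# A conventional convolutional neuron corresponds to nodal="mul", pool="sum".
NODAL_OPS = {
    "mul": lambda x, w: x * w,          # linear nodal operator (CNN case)
    "sin": lambda x, w: np.sin(w * x),  # example non-linear nodal operator
    "exp": lambda x, w: np.exp(w * x) - 1.0,
}
POOL_OPS = {
    "sum": np.sum,
    "median": np.median,
    "max": np.max,
}

def operational_neuron_1d(x, w, nodal="mul", pool="sum"):
    """Slide kernel w over signal x, applying the chosen nodal/pool pair,
    then an activation operator (tanh here)."""
    k = len(w)
    out = np.empty(len(x) - k + 1)
    for i in range(len(out)):
        patch = x[i:i + k]
        out[i] = POOL_OPS[pool](NODAL_OPS[nodal](patch, w))
    return np.tanh(out)

x = np.linspace(0.0, 1.0, 8)
w = np.array([0.5, -0.25, 0.1])
y_conv = operational_neuron_1d(x, w, "mul", "sum")    # reduces to a CNN neuron
y_onn = operational_neuron_1d(x, w, "sin", "median")  # a heterogeneous choice
```

Under this formulation, the search problem the paper addresses is which nodal/pool pair to assign to each neuron; GIS fixes one pair per layer, whereas the synaptic-plasticity ranking allows different top-ranked operator sets per hidden layer.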
Pages: 7997 - 8015
Page count: 19