Modeling and contractivity of neural-synaptic networks with Hebbian learning

Cited by: 2
Authors
Centorrino, Veronica [1 ]
Bullo, Francesco [2 ,3 ]
Russo, Giovanni [4 ]
Affiliations
[1] Scuola Superiore Meridionale, Naples, Italy
[2] University of California, Santa Barbara, Department of Mechanical Engineering, Santa Barbara, CA, USA
[3] University of California, Santa Barbara, Center for Control, Dynamical Systems and Computation, Santa Barbara, CA, USA
[4] University of Salerno, Department of Information and Electrical Engineering and Applied Mathematics, Salerno, Italy
Keywords
Nonlinear network systems; Hebbian/anti-Hebbian learning; Contraction theory; Stability analysis
DOI
10.1016/j.automatica.2024.111636
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Subject Classification Code
0812
Abstract
This paper is concerned with the modeling and analysis of two of the most commonly used recurrent neural network models (i.e., the Hopfield neural network and the firing-rate neural network) with dynamic recurrent connections undergoing Hebbian learning rules. To capture the synaptic sparsity of neural circuits, we propose a low-dimensional formulation. We then characterize certain key dynamical properties. First, we give biologically inspired forward-invariance results. Then, we give sufficient conditions for the non-Euclidean contractivity of the models. Our contraction analysis yields stability and robustness of time-varying trajectories for networks with both excitatory and inhibitory synapses governed by both Hebbian and anti-Hebbian rules. For each model, we propose a contractivity test based on biologically meaningful quantities, e.g., the neural and synaptic decay rates, the maximum in-degree, and the maximum synaptic strength. We also show that the models satisfy Dale's Principle. Finally, we illustrate the effectiveness of our results via a numerical example. (c) 2024 Elsevier Ltd. All rights reserved.
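The abstract refers to coupled neural-synaptic dynamics without stating them explicitly. As a point of reference only, the sketch below gives one common Hopfield-type formulation in which the recurrent weights evolve under a Hebbian rule; the symbols a_i (neural decay rate), b_ij (synaptic decay rate), phi (activation function), and u_i (external input) are illustrative assumptions, and the paper's low-dimensional, sparsity-aware formulation and exact learning rules may differ.

% A minimal, self-contained LaTeX sketch (not the paper's exact model):
% a Hopfield-type network whose weights W_{ij} follow a Hebbian rule.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
\begin{align}
  % Neural dynamics: leaky integration of recurrent input plus external drive u_i
  \dot{x}_i &= -a_i x_i + \sum_{j=1}^{n} W_{ij}\,\phi(x_j) + u_i,\\
  % Synaptic dynamics: exponential decay plus Hebbian correlation of pre- and
  % post-synaptic activity (an anti-Hebbian rule flips the sign of this term)
  \dot{W}_{ij} &= -b_{ij} W_{ij} + \phi(x_i)\,\phi(x_j).
\end{align}
\end{document}

Under a formulation of this kind, a contractivity test of the type described in the abstract would involve bounds on the decay rates a_i and b_ij, the maximum in-degree of the synaptic graph, and the maximum synaptic strength max_ij |W_ij|.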
Pages: 11
Related Papers
50 records in total
  • [21] Organization of receptive fields in networks with Hebbian learning: The connection between synaptic and phenomenological models
    Shouval, H
    Cooper, LN
    BIOLOGICAL CYBERNETICS, 1996, 74 (05) : 439 - 447
  • [22] Euclidean Contractivity of Neural Networks With Symmetric Weights
    Centorrino, Veronica
    Gokhale, Anand
    Davydov, Alexander
    Russo, Giovanni
    Bullo, Francesco
    IEEE CONTROL SYSTEMS LETTERS, 2023, 7 : 1724 - 1729
  • [23] Informational characteristics of neural networks capable of associative learning based on Hebbian plasticity
    Frolov, AA
    Muravev, IP
    NETWORK-COMPUTATION IN NEURAL SYSTEMS, 1993, 4 (04) : 495 - 536
  • [24] Convolutional Neural Networks with Hebbian-Based Rules in Online Transfer Learning
    Aguilar Canto, Fernando Javier
    ADVANCES IN SOFT COMPUTING, MICAI 2020, PT I, 2020, 12468 : 35 - 49
  • [25] No need to forget, just keep the balance: Hebbian neural networks for statistical learning
    Tovar, Angel Eugenio
    Westermann, Gert
    COGNITION, 2023, 230
  • [26] The road to chaos by time-asymmetric Hebbian learning in recurrent neural networks
    Molter, Colin
    Salihoglu, Utku
    Bersini, Hugues
    NEURAL COMPUTATION, 2007, 19 (01) : 80 - 110
  • [27] Comparing the performance of Hebbian against backpropagation learning using convolutional neural networks
    Lagani, Gabriele
    Falchi, Fabrizio
    Gennaro, Claudio
    Amato, Giuseppe
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (08): : 6503 - 6519
  • [28] Adaptive Spiking Neural Networks with Hodgkin-Huxley Neurons and Hebbian Learning
    Long, Lyle N.
    2011 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2011: 165 - 165
  • [30] Generalized Quadratic Synaptic Neural Networks for ETo Modeling
    Adamala S.
    Raghuwanshi N.S.
    Mishra A.
    Environmental Processes, 2015, 2 (2) : 309 - 329