Modeling and contractivity of neural-synaptic networks with Hebbian learning

Cited by: 2
Authors
Centorrino, Veronica [1 ]
Bullo, Francesco [2 ,3 ]
Russo, Giovanni [4 ]
Affiliations
[1] Scuola Superiore Meridionale, Naples, Italy
[2] University of California, Santa Barbara, Department of Mechanical Engineering, Santa Barbara, CA, USA
[3] University of California, Santa Barbara, Center for Control, Dynamical Systems and Computation, Santa Barbara, CA, USA
[4] University of Salerno, Department of Information Engineering, Electrical Engineering and Applied Mathematics, Salerno, Italy
Keywords
Nonlinear network systems; Hebbian/anti-Hebbian learning; Contraction theory; Stability analysis
DOI
10.1016/j.automatica.2024.111636
Chinese Library Classification
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
This paper is concerned with the modeling and analysis of two of the most commonly used recurrent neural network models (i.e., the Hopfield neural network and the firing-rate neural network) with dynamic recurrent connections governed by Hebbian learning rules. To capture the synaptic sparsity of neural circuits, we propose a low-dimensional formulation. We then characterize several key dynamical properties. First, we give biologically inspired forward-invariance results. Then, we give sufficient conditions for the non-Euclidean contractivity of the models. Our contraction analysis yields stability and robustness of time-varying trajectories for networks with both excitatory and inhibitory synapses governed by both Hebbian and anti-Hebbian rules. For each model, we propose a contractivity test based upon biologically meaningful quantities, e.g., neural and synaptic decay rates, maximum in-degree, and maximum synaptic strength. Then, we show that the models satisfy Dale's Principle. Finally, we illustrate the effectiveness of our results via a numerical example. (c) 2024 Elsevier Ltd. All rights reserved.
Pages: 11
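For readers unfamiliar with the class of models described in the abstract, the Python sketch below simulates a generic Hopfield-type network whose recurrent weights evolve under a Hebbian rule with decay. The specific equations, parameter names (e.g., the neural decay rate g_x, the synaptic decay rate g_W), and the explicit Euler integration are illustrative assumptions only; they are not the paper's low-dimensional formulation, nor its contractivity test.

# Illustrative sketch (not the paper's exact model): a Hopfield-type network
# with dynamic recurrent weights under a Hebbian rule with decay.
# Neuron dynamics:   dx/dt = -g_x * x + W @ tanh(x) + u
# Synapse dynamics:  dW/dt = -g_W * W + outer(tanh(x), tanh(x))
import numpy as np

def simulate(n=10, T=5.0, dt=1e-3, g_x=1.0, g_W=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n)               # neural state
    W = 0.1 * rng.normal(size=(n, n))    # recurrent synaptic weights
    u = rng.normal(size=n)               # constant external input
    for _ in range(int(T / dt)):
        r = np.tanh(x)                   # neural activations
        dx = -g_x * x + W @ r + u        # leaky neural dynamics
        dW = -g_W * W + np.outer(r, r)   # Hebbian correlation term with decay
        x += dt * dx                     # forward Euler step
        W += dt * dW
    return x, W

if __name__ == "__main__":
    x, W = simulate()
    print("final neural state:", np.round(x, 3))
    print("maximum synaptic strength:", np.abs(W).max())

In this sketch, larger decay rates g_x and g_W relative to the coupling strengths make the joint neuron-synapse dynamics converge toward a steady state, which loosely mirrors the role that neural/synaptic decay rates and maximum synaptic strength play in the contractivity conditions mentioned in the abstract.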