Contraction Analysis of Hopfield Neural Networks with Hebbian Learning

Cited by: 3
Authors:
Centorrino, Veronica [1]
Bullo, Francesco [2,3]
Russo, Giovanni [4]
Affiliations:
[1] Univ Naples Federico II, Scuola Super Meridionale, Naples, Italy
[2] Univ Calif Santa Barbara, Ctr Control Dynam Syst & Computat, Santa Barbara, CA USA
[3] Univ Calif Santa Barbara, Dept Mech Engn, Santa Barbara, CA USA
[4] Univ Salerno, Dept Informat & Electric Engn & Appl Math, Salerno, Italy
DOI: 10.1109/CDC51059.2022.9993009
Chinese Library Classification (CLC): TP [automation and computer technology]
Discipline classification code: 0812
Abstract:
Motivated by advances in neuroscience and machine learning, this paper is concerned with the modeling and analysis of Hopfield neural networks with dynamic recurrent connections undergoing Hebbian learning. To capture the synaptic sparsity of neural circuits, we propose a low-dimensional formulation of the model and then characterize its key dynamical properties. First, we give a biologically inspired forward-invariance result. Then, we give sufficient conditions for the non-Euclidean contractivity of the model. Our contraction analysis yields stability and robustness of time-varying trajectories for networks with both excitatory and inhibitory synapses governed by both Hebbian and anti-Hebbian rules. Our proposed contractivity test is based on biologically meaningful quantities, e.g., the neural and synaptic decay rates, the maximum out-degree, and the maximum synaptic strength. Finally, we show that the model satisfies Dale's principle. The effectiveness of our results is illustrated via a numerical example.
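To make the setup described in the abstract concrete, below is a minimal simulation sketch of a Hopfield network whose recurrent weights evolve under a Hebbian rule with decay on a fixed sparse connectivity mask. The specific dynamics, the decay rates a and b, the tanh activation, and the connectivity density are illustrative assumptions for this sketch, not the exact formulation or contractivity test used in the paper.

```python
# Minimal sketch (not the paper's exact model): forward-Euler simulation of a
# Hopfield network whose recurrent weights evolve under a Hebbian rule with
# decay, restricted to a fixed sparse connectivity mask. All rates, gains, and
# the activation choice below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n = 10                        # number of neurons
dt, T = 1e-3, 5.0             # Euler step and simulation horizon
a, b = 1.0, 1.0               # neural and synaptic decay rates (assumed)
phi = np.tanh                 # saturating activation (assumed)

# Sparse synaptic mask: only these entries carry plastic synapses.
mask = (rng.random((n, n)) < 0.3).astype(float)
np.fill_diagonal(mask, 0.0)

x = 0.1 * rng.standard_normal(n)               # neural state
W = 0.1 * rng.standard_normal((n, n)) * mask   # initial synaptic weights
u = 0.5 * np.sin(np.arange(n))                 # constant external input

for _ in range(int(T / dt)):
    y = phi(x)
    # Neural (Hopfield) dynamics: leak + recurrent drive + external input.
    dx = -a * x + W @ y + u
    # Hebbian synaptic dynamics with decay, kept on the sparse mask
    # (an anti-Hebbian rule would flip the sign of the outer-product term).
    dW = (-b * W + np.outer(y, y)) * mask
    x = x + dt * dx
    W = W + dt * dW

print("final neural state:", np.round(x, 3))
print("max synaptic strength:", np.round(np.abs(W).max(), 3))
```

Flipping the sign of the outer-product term gives an anti-Hebbian update, and constraining each column of W to a single sign is one way such a sketch could respect Dale's principle; both variants are assumptions here rather than the paper's construction.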
Pages: 622-627 (6 pages)