Mean-field Theory and Synchronization in Random Recurrent Neural Networks

Cited by: 0
Authors
Emmanuel Dauce
Olivier Moynot
Olivier Pinaud
Manuel Samuelides
Affiliations
[1] DTIM,ONERA Centre de Toulouse
[2] Université Paul Sabatier,LSP
[3] Université Paul Sabatier,Mathématiques pour l'Industrie et la Physique (UMR 5640)
[4] ENSAE,Mouvement et Perception
[5] Faculté des sciences du sport
Source
Neural Processing Letters | 2001, Vol. 14
Keywords
asymmetric networks; chaos; mean field theory; stochastic dynamics; synchronization; two-population models;
Abstract
In this paper, we first present a new mathematical approach, based on large deviation techniques, for the study of a large random recurrent neural network with discrete-time dynamics. In particular, we state a mean-field property and a law of large numbers in the most general case of random models with sparse connections and several populations. Our results are supported by rigorous proofs. We then focus on large-size dynamics in the case of a model with excitatory and inhibitory populations. The study of the mean-field system and of the divergence of individual trajectories allows us to define different dynamical regimes in the macroscopic parameter space, including chaos and collective synchronization phenomena. Finally, we examine the behavior of a particular finite-size system subjected to Gaussian static inputs. The system adapts its dynamics to the input signal and spontaneously produces dynamical transitions from asynchronous to synchronous behavior, corresponding to the crossing of a bifurcation line in the macroscopic parameter space.
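The discrete-time random recurrent dynamics the abstract refers to can be illustrated with a minimal simulation. This is a sketch under stated assumptions, not the paper's model: it uses a single population with dense i.i.d. Gaussian couplings of variance g²/N and a tanh transfer function, whereas the paper treats sparse connections and several (excitatory/inhibitory) populations; the function name `simulate_rrnn` and all parameter values are illustrative.

```python
import numpy as np

def simulate_rrnn(N=200, g=2.0, T=100, seed=0):
    """Simulate a discrete-time random recurrent network x_{t+1} = tanh(J x_t).

    J_ij ~ N(0, g^2/N), i.i.d. and asymmetric (J != J^T), so for large N
    a mean-field description of the population statistics applies; g > 1
    typically places the network in the chaotic regime.
    """
    rng = np.random.default_rng(seed)
    # Coupling matrix: i.i.d. Gaussian entries, variance g^2 / N
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
    # Random initial state in [-1, 1]
    x = rng.uniform(-1.0, 1.0, size=N)
    traj = np.empty((T, N))
    for t in range(T):
        x = np.tanh(J @ x)   # synchronous discrete-time update
        traj[t] = x
    return traj

traj = simulate_rrnn()
# Population-averaged activity per time step (the macroscopic observable
# that a mean-field/law-of-large-numbers result describes as N grows)
mean_activity = traj.mean(axis=1)
```

Varying the gain g (or, in a two-population version, the relative excitatory/inhibitory weights) moves the network across the dynamical regimes the paper maps out, from fixed points to chaos.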
Pages: 115-126 (11 pages)