Mean-field theory and synchronization in random recurrent neural networks

Cited by: 8
Authors
Dauce, E
Moynot, O
Pinaud, O
Samuelides, M
Affiliations
[1] DTIM, ONERA Ctr Toulouse, F-31055 Toulouse, France
[2] Univ Toulouse 3, LSP, F-31062 Toulouse, France
[3] Univ Toulouse 3, UMR 5640, F-31062 Toulouse, France
[4] ENSAE, F-31055 Toulouse, France
[5] Fac Sci Sport, F-13288 Marseille 09, France
Keywords
asymmetric networks; chaos; mean field theory; stochastic dynamics; synchronization; two-population models;
DOI
10.1023/A:1012435207437
CLC classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
In this paper, we first present a new mathematical approach, based on large deviation techniques, for the study of a large random recurrent neural network with discrete-time dynamics. In particular, we state a mean-field property and a law of large numbers in the most general case of random models with sparse connections and several populations. Our results are supported by rigorous proofs. We then focus on the large-size dynamics of a model with excitatory and inhibitory populations. The study of the mean-field system and of the divergence of individual trajectories allows us to define different dynamical regimes in the macroscopic parameter space, including chaos and collective synchronization phenomena. Finally, we examine the behavior of a particular finite-size system subjected to static Gaussian inputs. The system adapts its dynamics to the input signal and spontaneously produces dynamical transitions from asynchronous to synchronous behavior, corresponding to the crossing of a bifurcation line in the macroscopic parameter space.
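To make the model described above concrete, the following Python sketch simulates a small discrete-time random recurrent network with one excitatory and one inhibitory population and tracks its mean activity, the macroscopic observable that the mean-field theory describes. It is a minimal illustration only: the sigmoidal transfer function, the coupling statistics, the input variance, and all numerical values are assumptions chosen for the example, not the model or parameters used in the paper.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# --- Illustrative (assumed) parameters: not the values used in the paper ---
N_E, N_I = 400, 100            # excitatory / inhibitory population sizes
N = N_E + N_I
g, theta = 5.0, 0.2            # gain and threshold of the transfer function
mu_E, mu_I = 1.0, -4.0         # mean couplings from E and I neurons
sigma = 1.0                    # coupling standard deviation parameter
T = 200                        # number of discrete time steps

def f(u):
    """Sigmoidal transfer function, applied elementwise."""
    return 1.0 / (1.0 + np.exp(-g * (u - theta)))

# Random Gaussian couplings: means scale as 1/N and standard deviations as
# 1/sqrt(N), so the local fields stay O(1) as N grows (the mean-field regime).
J = np.empty((N, N))
J[:, :N_E] = rng.normal(mu_E / N, sigma / np.sqrt(N), size=(N, N_E))
J[:, N_E:] = rng.normal(mu_I / N, sigma / np.sqrt(N), size=(N, N_I))

# Static Gaussian input, echoing the last experiment described in the abstract.
I_ext = rng.normal(0.0, 0.3, size=N)

# Discrete-time dynamics: x(t+1) = f(J x(t) + I_ext).
x = rng.uniform(0.0, 1.0, size=N)
mean_activity = np.empty(T)
for t in range(T):
    x = f(J @ x + I_ext)
    mean_activity[t] = x.mean()   # macroscopic observable tracked by the theory

# A roughly constant mean activity with strong individual fluctuations suggests
# an asynchronous regime; an oscillating mean suggests collective synchronization.
print("mean activity, last 50 steps:", mean_activity[-50:])
```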
Pages: 115-126
Page count: 12