Asymptotic Description of Neural Networks with Correlated Synaptic Weights

Cited by: 7
Authors
Faugeras, Olivier [1 ]
MacLaurin, James [1 ]
Affiliations
[1] INRIA Sophia Antipolis Mediterranee, F-06410 Sophia Antipolis, France
Keywords
large deviations; good rate function; stationary gaussian processes; stationary measures; spectral representations; neural networks; firing rate neurons; correlated synaptic weights; RANDOMLY ASYMMETRIC BONDS; MEAN-FIELD THEORY; LARGE DEVIATIONS; SPIN SYSTEMS; DYNAMICS; EXPANSION; MODEL; TIME;
DOI
10.3390/e17074701
CLC number
O4 [Physics]
Subject classification code
0702
Abstract
We study the asymptotic law of a completely connected network of interacting neurons, in which the synaptic weights are Gaussian correlated random variables, as the number of neurons goes to infinity. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network of neurons, and the law of these trajectories averaged with respect to the synaptic weights. The main result of this article is that the image of this averaged law under the empirical measure satisfies a large deviation principle with a good rate function, which is shown to have a unique global minimum. Our analysis of the rate function also allows us to characterize the limit measure as the image of a stationary Gaussian measure defined on a transformed set of trajectories.
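The setting of the abstract can be illustrated numerically. The sketch below is not the paper's exact model: the dynamics (discrete-time `tanh` rate units), the row-correlation structure of the weights, the correlation level `rho`, and the noise amplitude are all illustrative assumptions. It shows the two objects named in the abstract in their simplest form: a finite network with correlated Gaussian synaptic weights, and the empirical measure of the resulting trajectories (here, the collection of per-neuron paths).

```python
import numpy as np

rng = np.random.default_rng(0)

N, T = 200, 50            # neurons, time steps (illustrative sizes)
rho = 0.3                 # assumed correlation between weights in a row
sigma = 1.0 / np.sqrt(N)  # mean-field scaling of the weight standard deviation

# Correlated Gaussian weights: each row J[i, :] mixes a shared Gaussian
# component with independent entries, giving pairwise correlation rho.
common = rng.standard_normal((N, 1))
indep = rng.standard_normal((N, N))
J = sigma * (np.sqrt(rho) * common + np.sqrt(1.0 - rho) * indep)

# Discrete-time firing-rate dynamics: x_{t+1} = tanh(J x_t) + noise.
x = np.zeros((T + 1, N))
x[0] = rng.standard_normal(N)
for t in range(T):
    x[t + 1] = np.tanh(J @ x[t]) + 0.1 * rng.standard_normal(N)

# Empirical measure over trajectories: the N columns of x, each a path in
# R^{T+1}; its mean path is one simple functional of that measure.
mean_path = x.mean(axis=1)
print(mean_path.shape)  # (51,)
```

The large deviation principle of the paper concerns the fluctuations of this empirical measure of paths around its limit as N grows; the simulation above only produces one finite-N sample of it.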
Pages: 4701-4743
Page count: 43
Related papers
50 items
  • [1] Robustness to Noisy Synaptic Weights in Spiking Neural Networks
    Li, Chen
    Chen, Runze
    Moutafis, Christoforos
    Furber, Steve
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [2] A large deviation principle for networks of rate neurons with correlated synaptic weights
    Olivier Faugeras
    James MacLaurin
    BMC Neuroscience, 14 (Suppl 1)
  • [3] Chaos and Correlated Avalanches in Excitatory Neural Networks with Synaptic Plasticity
    Pittorino, Fabrizio
    Ibanez-Berganza, Miguel
    di Volo, Matteo
    Vezzani, Alessandro
    Burioni, Raffaella
    PHYSICAL REVIEW LETTERS, 2017, 118 (09)
  • [4] Sparse Activations with Correlated Weights in Cortex-Inspired Neural Networks
    Chun, Chanwoo
    Lee, Daniel D.
    CONFERENCE ON PARSIMONY AND LEARNING, VOL 234, 2024, 234 : 248 - 268
  • [5] Co-learning synaptic delays, weights and adaptation in spiking neural networks
    Deckers, Lucas
    Van Damme, Laurens
    Van Leekwijck, Werner
    Tsang, Ing Jyh
    Latre, Steven
    FRONTIERS IN NEUROSCIENCE, 2024, 18
  • [6] Training multi-layer spiking neural networks with plastic synaptic weights and delays
    Wang, Jing
    FRONTIERS IN NEUROSCIENCE, 2024, 17
  • [7] Logic circuits of variable function using neural networks and their method of calculating synaptic weights
    Teramura, M
    Nomiyama, T
    Miyazaki, T
    ELECTRONICS AND COMMUNICATIONS IN JAPAN PART III-FUNDAMENTAL ELECTRONIC SCIENCE, 2000, 83 (01): : 14 - 20
  • [8] Logic circuits of variable function using neural networks and their method of calculating synaptic weights
    Teramura, Masahiro
    Nomiyama, Teruaki
    Miyazaki, Tomoyuki
2000, Scripta Technica Inc, New York (83)
  • [9] Dynamic distribution of synaptic weights in simple networks
    Letellier, M.
    Goda, Y.
    JOURNAL OF NEUROCHEMISTRY, 2012, 122 : 4 - 4
  • [10] On neural networks with minimal weights
    Bohossian, V
    Bruck, J
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 8: PROCEEDINGS OF THE 1995 CONFERENCE, 1996, 8 : 246 - 252