Asymptotic Description of Neural Networks with Correlated Synaptic Weights

Cited: 7
Authors
Faugeras, Olivier [1 ]
MacLaurin, James [1 ]
Affiliations
[1] INRIA Sophia Antipolis Méditerranée, F-06410 Sophia Antipolis, France
Keywords
large deviations; good rate function; stationary Gaussian processes; stationary measures; spectral representations; neural networks; firing rate neurons; correlated synaptic weights; randomly asymmetric bonds; mean-field theory; spin systems; dynamics; expansion; model; time
DOI
10.3390/e17074701
CLC number
O4 [Physics]
Subject classification code
0702
Abstract
We study the asymptotic law of a completely connected network of interacting neurons whose synaptic weights are correlated Gaussian random variables, as the number of neurons goes to infinity. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network, together with the law of these trajectories averaged with respect to the synaptic weights. The main result of this article is that the image of the averaged law under the empirical measure satisfies a large deviation principle with a good rate function, which is shown to have a unique global minimum. Our analysis of the rate function also allows us to characterize the limit measure as the image of a stationary Gaussian measure defined on a transformed set of trajectories.
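To make the statement concrete, here is a schematic sketch of the kind of model and result the abstract describes. The exact equations, noise model, and symbols below ($V^j_t$, $J^N_{ji}$, $f$, $\gamma$, $\theta$, $B^j_t$, $\hat{\mu}_N$, $H$) are illustrative assumptions in the style of standard discrete-time firing-rate models, not quoted from the paper. The membrane potential of neuron $j$ in a network of $N$ neurons might evolve as
\[
V^j_{t+1} = \gamma V^j_t + \sum_{i=1}^{N} J^N_{ji}\, f(V^i_t) + \theta + B^j_t, \qquad j = 1, \dots, N,
\]
where $f$ is a sigmoidal firing-rate function, the $B^j_t$ are independent noise terms, and the synaptic weights $(J^N_{ji})$ are correlated Gaussian random variables. With the process-level empirical measure of the trajectories,
\[
\hat{\mu}_N = \frac{1}{N} \sum_{j=1}^{N} \delta_{(V^j_t)_t},
\]
a large deviation principle asserts, informally, that
\[
\mathbb{P}\bigl(\hat{\mu}_N \in A\bigr) \approx \exp\Bigl(-N \inf_{\mu \in A} H(\mu)\Bigr)
\]
for a good rate function $H$; the unique global minimum of $H$ then identifies the limit law of the network.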
Pages: 4701-4743
Page count: 43
Related papers
50 items in total
  • [31] Distributed synaptic weights in a LIF neural network and learning rules
    Perthame, Benoit
    Salort, Delphine
    Wainrib, Gilles
    PHYSICA D-NONLINEAR PHENOMENA, 2017, 353 : 20 - 30
  • [32] Asymptotic Properties of a Dynamic Neural System with Asymmetric Connection Weights
    Yan, Keyu
    Zhong, Shouming
    Yang, Jinxiang
    Journal of Electronic Science and Technology of China, 2005, (01) : 78 - 81
  • [33] Asymptotic description of stochastic neural networks. I. Existence of a large deviation principle
    Faugeras, Olivier
    MacLaurin, James
    COMPTES RENDUS MATHEMATIQUE, 2014, 352 (10) : 841 - 846
  • [34] Global Asymptotic Stability and Stabilization of Long Short-Term Memory Neural Networks with Constant Weights and Biases
    Deka, Shankar A.
    Stipanovic, Dusan M.
    Murmann, Boris
    Tomlin, Claire J.
    JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2019, 181 (01) : 231 - 243
  • [35] Quantized Neural Networks: Training Neural Networks with Low Precision Weights and Activations
    Hubara, Itay
    Courbariaux, Matthieu
    Soudry, Daniel
    El-Yaniv, Ran
    Bengio, Yoshua
    JOURNAL OF MACHINE LEARNING RESEARCH, 2018, 18
  • [37] Synaptic metaplasticity in binarized neural networks
    Laborieux, Axel
    Ernoult, Maxence
    Hirtzlin, Tifenn
    Querlioz, Damien
    NATURE COMMUNICATIONS, 2021, 12 (01)
  • [39] Euclidean Contractivity of Neural Networks With Symmetric Weights
    Centorrino, Veronica
    Gokhale, Anand
    Davydov, Alexander
    Russo, Giovanni
    Bullo, Francesco
    IEEE CONTROL SYSTEMS LETTERS, 2023, 7 : 1724 - 1729
  • [40] Routes to chaos in neural networks with random weights
    Albers, DJ
    Sprott, JC
    Dechert, WD
    INTERNATIONAL JOURNAL OF BIFURCATION AND CHAOS, 1998, 8 (07) : 1463 - 1478