Asymptotic Description of Neural Networks with Correlated Synaptic Weights

Cited by: 7
Authors
Faugeras, Olivier [1 ]
MacLaurin, James [1 ]
Affiliations
[1] INRIA Sophia Antipolis Méditerranée, F-06410 Sophia Antipolis, France
Keywords
large deviations; good rate function; stationary Gaussian processes; stationary measures; spectral representations; neural networks; firing-rate neurons; correlated synaptic weights; RANDOMLY ASYMMETRIC BONDS; MEAN-FIELD THEORY; LARGE DEVIATIONS; SPIN SYSTEMS; DYNAMICS; EXPANSION; MODEL; TIME
DOI
10.3390/e17074701
Chinese Library Classification (CLC)
O4 [Physics]
Subject Classification Code
0702
Abstract
We study the asymptotic law of a network of interacting neurons as the number of neurons becomes infinite. The network is completely connected, and its synaptic weights are correlated Gaussian random variables. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network, together with the averaged law (with respect to the synaptic weights) of these trajectories. The main result of this article is that the image law through the empirical measure satisfies a large deviation principle with a good rate function, which is shown to have a unique global minimum. Our analysis of the rate function also allows us to characterize the limit measure as the image of a stationary Gaussian measure defined on a transformed set of trajectories.
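To make the abstract's objects concrete, the following is a minimal worked sketch, not the paper's exact formulation: it assumes a generic discrete-time firing-rate model, and the symbols U^i (membrane potential), gamma (leak), f (firing-rate function), J_ij (weights), B^i (noise), as well as the simplified (non-process-level) empirical measure, are illustrative placeholders.

\[
  U^i_{t+1} = \gamma\, U^i_t + \sum_{j=1}^{N} J_{ij}\, f\!\left(U^j_t\right) + B^i_t,
  \qquad i = 1, \dots, N,
\]
where the weights $(J_{ij})$ are correlated Gaussian random variables and the $(B^i_t)$ are independent noises. The empirical measure of the trajectories $U^i = (U^i_t)_{0 \le t \le T}$ is
\[
  \hat{\mu}_N = \frac{1}{N} \sum_{i=1}^{N} \delta_{U^i},
\]
and a large deviation principle with good rate function $H$ states that, for every Borel set $A$ of measures,
\[
  -\inf_{\mu \in \operatorname{int} A} H(\mu)
  \le \liminf_{N \to \infty} \tfrac{1}{N} \log P\bigl(\hat{\mu}_N \in A\bigr)
  \le \limsup_{N \to \infty} \tfrac{1}{N} \log P\bigl(\hat{\mu}_N \in A\bigr)
  \le -\inf_{\mu \in \overline{A}} H(\mu).
\]
Because $H$ is good (it has compact level sets) and admits a unique global minimum $\mu_*$, the empirical measures $\hat{\mu}_N$ concentrate on $\mu_*$ as $N \to \infty$, which is the sense in which the limit law is characterized.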
Pages: 4701 - 4743
Number of pages: 43
Related Papers
50 records in total
  • [41] Learning Neural Networks without Lazy Weights
    Lee, Dong-gi
    Cho, Junhee
    Kim, Myungjun
    Park, Sunghong
    Shin, Hyunjung
    2022 IEEE INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (IEEE BIGCOMP 2022), 2022, : 82 - 87
  • [42] Determination of weights for relaxation recurrent neural networks
    Serpen, G
    Livingston, DL
    NEUROCOMPUTING, 2000, 34 : 145 - 168
  • [43] Neural Networks Between Integer and Rational Weights
    Sima, Jiri
    2017 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2017, : 154 - 161
  • [44] Hardness of Learning Neural Networks with Natural Weights
    Daniely, Amit
    Vardi, Gal
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [45] Neural Networks with Unipolar Weights and Normalized Thresholds
    Brodka, JS
    Macukow, B
    OPTICAL COMPUTING, 1995, 139 : 463 - 466
  • [46] Stochastic Weights Binary Neural Networks on FPGA
    Fukuda, Yasushi
    Kawahara, Takayuki
    2018 7TH IEEE INTERNATIONAL SYMPOSIUM ON NEXT-GENERATION ELECTRONICS (ISNE), 2018, : 220 - 222
  • [47] Neural Networks with Superexpressive Activations and Integer Weights
    Beknazaryan, Aleksandr
    INTELLIGENT COMPUTING, VOL 2, 2022, 507 : 445 - 451
  • [48] A novel method to compute the weights of neural networks
    Gao, Zhentao
    Chen, Yuanyuan
    Yi, Zhang
    NEUROCOMPUTING, 2020, 407 : 409 - 427
  • [49] A Convolutional Accelerator for Neural Networks With Binary Weights
    Ardakani, Arash
    Condo, Carlo
    Gross, Warren J.
    2018 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2018