RANDOM NEURAL NETWORKS IN THE INFINITE WIDTH LIMIT AS GAUSSIAN PROCESSES

Cited by: 7
Author
Hanin, Boris [1 ]
Affiliation
[1] Princeton Univ, Dept Operat Res & Financial Engn, Princeton, NJ 08544 USA
Source
ANNALS OF APPLIED PROBABILITY | 2023, Vol. 33, Issue 6A
Keywords
Neural networks; Gaussian processes; limit theorems; PRODUCTS;
DOI
10.1214/23-AAP1933
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Codes
020208; 070103; 0714;
Abstract
This article gives a new proof that fully connected neural networks with random weights and biases converge to Gaussian processes in the regime where the input dimension, output dimension, and depth are kept fixed, while the hidden layer widths tend to infinity. Unlike prior work, convergence is shown assuming only moment conditions for the distribution of weights and for quite general nonlinearities.
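The convergence described in the abstract can be checked empirically: fix an input, repeatedly draw the weights and biases of a wide fully connected network, and the distribution of the scalar output should look approximately Gaussian. The sketch below is illustrative only and is not taken from the paper; the function name, the ReLU nonlinearity, the 1/sqrt(fan_in) weight scaling, and the chosen widths are all assumptions made for the demonstration.

```python
import numpy as np

def random_mlp_output(x, widths, rng, sigma_w=1.0, sigma_b=0.1):
    """One draw of a fully connected net with i.i.d. Gaussian weights.

    Weights are scaled by sigma_w / sqrt(fan_in) so the pre-activation
    variance stays O(1) as the hidden widths grow (an assumed convention).
    """
    h = x
    for fan_out in widths:
        fan_in = h.shape[0]
        W = rng.normal(0.0, sigma_w / np.sqrt(fan_in), size=(fan_out, fan_in))
        b = rng.normal(0.0, sigma_b, size=fan_out)
        h = np.maximum(W @ h + b, 0.0)  # ReLU hidden layer
    # Linear readout to a single scalar output
    fan_in = h.shape[0]
    W = rng.normal(0.0, sigma_w / np.sqrt(fan_in), size=(1, fan_in))
    return (W @ h)[0]

rng = np.random.default_rng(0)
x = np.ones(3)  # fixed input; input dimension stays fixed, widths grow
samples = np.array(
    [random_mlp_output(x, [256, 256], rng) for _ in range(2000)]
)

# For a Gaussian, skewness is ~0 and excess kurtosis is ~0;
# deviations shrink as the hidden widths tend to infinity.
z = (samples - samples.mean()) / samples.std()
skew = np.mean(z**3)
excess_kurt = np.mean(z**4) - 3.0
print(f"skewness ~ {skew:.3f}, excess kurtosis ~ {excess_kurt:.3f}")
```

At width 256 both statistics should already be close to zero, consistent with the finite-width output being near its Gaussian-process limit.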
Pages: 4798-4819
Page count: 22