Random ReLU Neural Networks as Non-Gaussian Processes

Cited: 0
Authors
Parhi, Rahul [1 ]
Bohra, Pakshal [2 ]
El Biari, Ayoub [2 ]
Pourya, Mehrsa [2 ]
Unser, Michael [2 ]
Affiliations
[1] Department of Electrical and Computer Engineering, University of California, San Diego, La Jolla, CA 92093, United States
[2] Biomedical Imaging Group, École polytechnique fédérale de Lausanne, CH-1015 Lausanne, Switzerland
Funding
European Research Council
Keywords
Differential equations - Gaussian distribution - Gaussian noise (electronic) - Impulse noise - Neural networks - Random processes - Random variables - Stochastic systems;
DOI
Not available
Abstract
We consider a large class of shallow neural networks with randomly initialized parameters and rectified linear unit activation functions. We prove that these random neural networks are well-defined non-Gaussian processes. As a by-product, we demonstrate that these networks are solutions to stochastic differential equations driven by impulsive white noise (combinations of random Dirac measures). These processes are parameterized by the law of the weights and biases as well as the density of activation thresholds in each bounded region of the input domain. We prove that these processes are isotropic and wide-sense self-similar with Hurst exponent 3/2. We also derive a remarkably simple closed-form expression for their autocovariance function. Our results are fundamentally different from prior work in that we consider a non-asymptotic viewpoint: The number of neurons in each bounded region of the input domain (i.e., the width) is itself a random variable with a Poisson law with mean proportional to the density parameter. Finally, we show that, under suitable hypotheses, as the expected width tends to infinity, these processes can converge in law not only to Gaussian processes, but also to non-Gaussian processes depending on the law of the weights. Our asymptotic results provide a new take on several classical results (wide networks converge to Gaussian processes) as well as some new ones (wide networks can converge to non-Gaussian processes). ©2025 Rahul Parhi, Pakshal Bohra, Ayoub El Biari, Mehrsa Pourya, and Michael Unser.
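The non-asymptotic construction described above (a shallow ReLU network whose width in a bounded region is Poisson with mean proportional to a density parameter, with random activation thresholds and i.i.d. weights) can be sketched in a few lines. This is an illustrative simulation, not the paper's construction: the 1D input domain, the uniform law for the thresholds, and the Gaussian weight law are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_random_relu_network(domain=(-1.0, 1.0), density=50.0):
    """Draw one realization of a random shallow ReLU network.

    Illustrative assumptions: 1D input domain, uniform activation
    thresholds, standard Gaussian weights.
    """
    a, b = domain
    # The width is itself random: Poisson with mean proportional
    # to the density parameter times the size of the region.
    width = rng.poisson(density * (b - a))
    thresholds = rng.uniform(a, b, size=width)  # activation thresholds
    weights = rng.standard_normal(width)        # i.i.d. weights
    def f(x):
        x = np.asarray(x, dtype=float)
        # f(x) = sum_k w_k * ReLU(x - t_k)
        return np.maximum(x[..., None] - thresholds, 0.0) @ weights
    return f

f = sample_random_relu_network()
xs = np.linspace(-1.0, 1.0, 5)
print(f(xs).shape)  # (5,)
```

Each realization is a continuous piecewise-linear function with a random number of kinks; averaging many such realizations pointwise is one way to probe the process's covariance structure empirically.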