Trajectory growth lower bounds for random sparse deep ReLU networks

Cited by: 2
Authors
Price, Ilan [1 ,2 ]
Tanner, Jared [1 ,2 ]
Affiliations
[1] Univ Oxford, Oxford, England
[2] Alan Turing Inst, London, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
DOI
10.1109/ICMLA52953.2021.00165
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper considers the growth in the length of one-dimensional trajectories as they are passed through random deep ReLU networks. We generalise existing results, providing an alternative, simpler method for lower bounding expected trajectory growth through random networks, for a more general class of weight distributions, including sparsely connected networks. We illustrate this approach by deriving bounds for sparse-Gaussian, sparse-uniform, and sparse-discrete-valued random nets. We prove that trajectory growth can remain exponential in such networks, with the sparsity parameter appearing in the base of the exponent.
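The phenomenon the abstract describes can be observed empirically: take a one-dimensional input trajectory, push it through successive random sparse-Gaussian ReLU layers, and record the Euclidean length of the resulting piecewise-linear curve after each layer. The sketch below is illustrative only, not the authors' construction; the width, depth, sparsity level, and variance scaling (He-style, rescaled by the sparsity parameter) are all assumed hyperparameters chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def trajectory_length(points):
    # Total Euclidean length of the piecewise-linear path through `points`
    # (rows are successive points along the trajectory).
    return float(np.sum(np.linalg.norm(np.diff(points, axis=0), axis=1)))

def sparse_gaussian_layer(width_in, width_out, sparsity, rng):
    # Sparse-Gaussian weights: each entry is kept with probability `sparsity`
    # and drawn i.i.d. N(0, 2 / (sparsity * width_in)) — an illustrative
    # He-style scaling compensated for the dropped connections.
    mask = rng.random((width_out, width_in)) < sparsity
    w = rng.normal(0.0, np.sqrt(2.0 / (sparsity * width_in)),
                   size=(width_out, width_in))
    return w * mask

width, depth, sparsity = 256, 8, 0.5

# 1-D input trajectory: a circle embedded in the first two input coordinates.
t = np.linspace(0.0, 2.0 * np.pi, 500)
x = np.zeros((t.size, width))
x[:, 0], x[:, 1] = np.cos(t), np.sin(t)

lengths = [trajectory_length(x)]
h = x
for _ in range(depth):
    layer = sparse_gaussian_layer(width, width, sparsity, rng)
    h = np.maximum(h @ layer.T, 0.0)  # ReLU activation
    lengths.append(trajectory_length(h))

for d, ell in enumerate(lengths):
    print(f"layer {d}: trajectory length {ell:.2f}")
```

Repeating the experiment over many random draws (and varying `sparsity`) gives an empirical picture of how the sparsity parameter enters the growth rate, which is the quantity the paper's lower bounds control.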
Pages: 1004-1009
Page count: 6