Trajectory growth lower bounds for random sparse deep ReLU networks

Cited by: 2
Authors
Price, Ilan [1,2]
Tanner, Jared [1,2]
Affiliations
[1] Univ Oxford, Oxford, England
[2] Alan Turing Inst, London, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
DOI
10.1109/ICMLA52953.2021.00165
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
This paper considers the growth in the length of one-dimensional trajectories as they are passed through random deep ReLU networks. We generalise existing results, providing an alternative, simpler method for lower bounding expected trajectory growth through random networks for a more general class of weight distributions, including sparsely connected networks. We illustrate this approach by deriving bounds for sparse-Gaussian, sparse-uniform, and sparse-discrete-valued random nets. We prove that trajectory growth can remain exponential in such networks, with the sparsity parameter appearing in the base of the exponent.
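To make the quantity studied in the abstract concrete, the sketch below (a minimal numerical illustration, not the paper's construction or experimental setup) passes a discretised unit circle through a random sparse-Gaussian ReLU network and records the polygonal arc length of its image after each layer. The width, depth, sparsity level, and the 1/p variance rescaling of the kept weights are all assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

def trajectory_length(points):
    # Polygonal arc length of a discretised trajectory (one point per row).
    return np.sum(np.linalg.norm(np.diff(points, axis=0), axis=1))

def sparse_gaussian_layer(n_in, n_out, p):
    # Sparse-Gaussian weights: each entry is kept with probability p and,
    # if kept, drawn i.i.d. N(0, 2/(p*n_in)). The 1/p variance rescaling
    # (He-style, adjusted for sparsity) is an assumption of this sketch.
    mask = rng.random((n_out, n_in)) < p
    weights = rng.normal(0.0, np.sqrt(2.0 / (p * n_in)), (n_out, n_in))
    return weights * mask

width, depth, sparsity = 500, 10, 0.2   # illustrative sizes, not the paper's
t = np.linspace(0.0, 2.0 * np.pi, 2000)
circle = np.stack([np.cos(t), np.sin(t)], axis=1)  # a 1-D trajectory in 2-D

# Embed the trajectory in the network's input space via its first two coords.
h = np.zeros((len(t), width))
h[:, :2] = circle

lengths = [trajectory_length(h)]
for _ in range(depth):
    h = np.maximum(h @ sparse_gaussian_layer(width, width, sparsity).T, 0.0)
    lengths.append(trajectory_length(h))

# Layer-wise growth factors. The paper lower-bounds the expected trajectory
# length by an exponential in depth whose base depends on the sparsity
# parameter; rerunning with different `sparsity` values shows how that
# parameter shifts these ratios.
ratios = np.array(lengths[1:]) / np.array(lengths[:-1])
print("layer-wise length ratios:", np.round(ratios, 3))
```

Sweeping `sparsity` over several values and comparing the per-layer ratios mirrors the paper's claim that the sparsity parameter enters the base of the exponential growth rate.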
Pages: 1004-1009
Page count: 6
Related Papers
50 records in total
  • [1] New Error Bounds for Deep ReLU Networks Using Sparse Grids
    Montanelli, Hadrien
    Du, Qiang
SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2019, 1 (1): 78-92
  • [2] Error bounds for approximations with deep ReLU networks
    Yarotsky, Dmitry
NEURAL NETWORKS, 2017, 94: 103-114
  • [3] Towards Lower Bounds on the Depth of ReLU Neural Networks
    Hertrich, Christoph
    Basu, Amitabh
    Di Summa, Marco
    Skutella, Martin
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021
  • [4] Towards Lower Bounds on the Depth of ReLU Neural Networks
    Hertrich, Christoph
    Basu, Amitabh
    Di Summa, Marco
    Skutella, Martin
SIAM JOURNAL ON DISCRETE MATHEMATICS, 2023, 37 (2): 997-1029
  • [5] Lower Bounds on the Capacities of Binary and Ternary Networks Storing Sparse Random Vectors
    Baram, Y.
    Salee, D.
    IEEE TRANSACTIONS ON INFORMATION THEORY, 1992, 38 (6): 1633-1647
  • [6] On the Error Bounds for ReLU Neural Networks
    Katende, Ronald
    Kasumba, Henry
    Kakuba, Godwin
    Mango, John
IAENG INTERNATIONAL JOURNAL OF APPLIED MATHEMATICS, 2024, 54 (12): 2602-2611
  • [7] Error bounds for approximations with deep ReLU neural networks in W^{s,p} norms
    Guehring, Ingo
    Kutyniok, Gitta
    Petersen, Philipp
ANALYSIS AND APPLICATIONS, 2020, 18 (5): 803-859
  • [8] Tight Bounds on the Smallest Eigenvalue of the Neural Tangent Kernel for Deep ReLU Networks
    Nguyen, Quynh
    Mondelli, Marco
    Montufar, Guido
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021
  • [9] SQ Lower Bounds for Random Sparse Planted Vector Problem
    Ding, Jingqiu
    Hua, Yiding
INTERNATIONAL CONFERENCE ON ALGORITHMIC LEARNING THEORY, VOL 201, 2023: 558-596
  • [10] Convergence of deep ReLU networks
    Xu, Yuesheng
    Zhang, Haizhang
    NEUROCOMPUTING, 2024, 571