Trajectory growth lower bounds for random sparse deep ReLU networks

Cited by: 2
Authors
Price, Ilan [1 ,2 ]
Tanner, Jared [1 ,2 ]
Affiliations
[1] Univ Oxford, Oxford, England
[2] Alan Turing Inst, London, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
DOI
10.1109/ICMLA52953.2021.00165
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper considers the growth in the length of one-dimensional trajectories as they are passed through random deep ReLU networks. We generalise existing results, providing an alternative, simpler method for lower bounding expected trajectory growth through random networks, for a more general class of weight distributions, including sparsely connected networks. We illustrate this approach by deriving bounds for sparse-Gaussian, sparse-uniform, and sparse-discrete-valued random nets. We prove that trajectory growth can remain exponential in such networks, with the sparsity parameter appearing in the base of the exponent.
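The effect described in the abstract can be observed numerically. The sketch below (not the paper's proof) passes a finely sampled one-dimensional trajectory, a circle embedded in the input space, through a random sparse-Gaussian ReLU network and records the polyline length after each layer. The layer widths, depth, sparsity level, and the He-style variance scaling `2/(sparsity * n_in)` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_gaussian_layer(n_in, n_out, sparsity):
    """Random weight matrix with i.i.d. N(0, 2/(sparsity*n_in)) entries,
    each kept independently with probability `sparsity` (zeroed otherwise)."""
    w = rng.normal(0.0, np.sqrt(2.0 / (sparsity * n_in)), size=(n_out, n_in))
    mask = rng.random((n_out, n_in)) < sparsity
    return w * mask

def trajectory_length(points):
    """Length of the polyline through `points` (rows are consecutive points)."""
    return float(np.sum(np.linalg.norm(np.diff(points, axis=0), axis=1)))

# One-dimensional input trajectory: a unit circle in a 2-D slice of the input.
n_in, width, depth, sparsity = 16, 64, 10, 0.5
t = np.linspace(0.0, 2.0 * np.pi, 1000)
traj = np.zeros((t.size, n_in))
traj[:, 0], traj[:, 1] = np.cos(t), np.sin(t)

lengths = [trajectory_length(traj)]   # initial length is approximately 2*pi
x = traj
sizes = [n_in] + [width] * depth
for n_in_l, n_out_l in zip(sizes[:-1], sizes[1:]):
    W = sparse_gaussian_layer(n_in_l, n_out_l, sparsity)
    x = np.maximum(W @ x.T, 0.0).T    # ReLU layer applied to every point
    lengths.append(trajectory_length(x))

print([round(L, 2) for L in lengths])
```

With these (assumed) settings, the printed per-layer lengths track how the trajectory stretches or contracts as depth increases; rerunning with smaller `sparsity` shows how the sparsity parameter modulates the growth rate.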
Pages: 1004-1009
Page count: 6
Related Papers
50 records
  • [21] Lower Bounds for Sparse Recovery
    Do Ba, Khanh
    Indyk, Piotr
    Price, Eric
    Woodruff, David P.
    PROCEEDINGS OF THE TWENTY-FIRST ANNUAL ACM-SIAM SYMPOSIUM ON DISCRETE ALGORITHMS, 2010, 135 : 1190 - +
  • [22] Lower Bounds for Adaptive Sparse Recovery
    Price, Eric
    Woodruff, David P.
    PROCEEDINGS OF THE TWENTY-FOURTH ANNUAL ACM-SIAM SYMPOSIUM ON DISCRETE ALGORITHMS (SODA 2013), 2013, : 652 - 663
  • [23] Lower Bounds for Sparse Quadratic Forms
    van de Geer, Sara
    ESTIMATION AND TESTING UNDER SPARSITY: ECOLE D'ETE DE PROBABILITES DE SAINT-FLOUR XLV - 2015, 2016, 2159 : 223 - 231
  • [24] Approximation in LP(μ) with deep ReLU neural networks
    Voigtlaender, Felix
    Petersen, Philipp
    2019 13TH INTERNATIONAL CONFERENCE ON SAMPLING THEORY AND APPLICATIONS (SAMPTA), 2019,
  • [25] Nearly Optimal Learning Using Sparse Deep ReLU Networks in Regularized Empirical Risk Minimization With Lipschitz Loss
    Huang, Ke
    Liu, Mingming
    Ma, Shujie
    NEURAL COMPUTATION, 2025, 37 (04) : 815 - 870
  • [26] Vanishing Curvature in Randomly Initialized Deep ReLU Networks
    Orvieto, Antonio
    Kohler, Jonas
    Pavllo, Dario
    Hofmann, Thomas
    Lucchi, Aurelien
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [27] Sum-of-Squares Lower Bounds for Independent Set on Ultra-Sparse Random Graphs
    Kothari, Pravesh K.
    Potechin, Aaron
    Xu, Jeff
    PROCEEDINGS OF THE 56TH ANNUAL ACM SYMPOSIUM ON THEORY OF COMPUTING, STOC 2024, 2024, : 1923 - 1934
  • [28] Approximation of Nonlinear Functionals Using Deep ReLU Networks
    Song, Linhao
    Fan, Jun
    Chen, Di-Rong
    Zhou, Ding-Xuan
    JOURNAL OF FOURIER ANALYSIS AND APPLICATIONS, 2023, 29 (04)
  • [30] A generative model for fBm with deep ReLU neural networks
    Allouche, Michaël
    Girard, Stéphane
    Gobet, Emmanuel
    JOURNAL OF COMPLEXITY, 2022, 73