Trajectory growth lower bounds for random sparse deep ReLU networks

Cited by: 2
Authors
Price, Ilan [1,2]
Tanner, Jared [1,2]
Affiliations
[1] Univ Oxford, Oxford, England
[2] Alan Turing Inst, London, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC)
DOI
10.1109/ICMLA52953.2021.00165
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
This paper considers the growth in the length of one-dimensional trajectories as they are passed through random deep ReLU networks. We generalise existing results, providing an alternative, simpler method for lower bounding expected trajectory growth through random networks for a more general class of weight distributions, including those of sparsely connected networks. We illustrate this approach by deriving bounds for sparse-Gaussian, sparse-uniform, and sparse-discrete-valued random nets. We prove that trajectory growth can remain exponential in such networks, with the sparsity parameter appearing in the base of the exponent.
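The abstract's central claim, that the expected length of a one-dimensional input trajectory grows (or shrinks) exponentially with depth and that the sparsity parameter enters the base of the exponent, can be observed numerically. Below is a minimal simulation sketch, not the authors' construction: the width, depth, and keep-probability values, the fixed 2/width weight variance, and the helper names trajectory_length and sparse_gaussian_layer are all illustrative assumptions.

```python
import numpy as np

def trajectory_length(points):
    """Euclidean length of the polyline through consecutive rows of `points`."""
    return np.linalg.norm(np.diff(points, axis=0), axis=1).sum()

def sparse_gaussian_layer(rng, width, keep_prob):
    """Square random layer: N(0, 2/width) weights, each kept i.i.d. with
    probability `keep_prob` and zeroed otherwise (a sparse-Gaussian model)."""
    w = rng.normal(0.0, np.sqrt(2.0 / width), (width, width))
    return w * (rng.random((width, width)) < keep_prob)

rng = np.random.default_rng(0)
width, depth = 256, 10

# One-dimensional input trajectory: a unit circle in the first two
# coordinates, discretised into a fine polyline.
t = np.linspace(0.0, 2.0 * np.pi, 2000)
x0 = np.zeros((t.size, width))
x0[:, 0], x0[:, 1] = np.cos(t), np.sin(t)

for keep_prob in (1.0, 0.75, 0.5):
    x = x0
    lengths = [trajectory_length(x)]
    for _ in range(depth):
        x = np.maximum(x @ sparse_gaussian_layer(rng, width, keep_prob), 0.0)
        lengths.append(trajectory_length(x))
    ratios = np.array(lengths[1:]) / np.array(lengths[:-1])
    # Roughly constant per-layer ratios indicate exponential dependence on
    # depth; smaller keep_prob lowers the ratio, i.e. the base of the exponent.
    print(f"keep_prob={keep_prob}: mean per-layer length ratio {ratios.mean():.2f}")
```

With the weight variance held fixed at 2/width, zeroing weights lowers the effective signal variance per layer, so the mean per-layer ratio drops as keep_prob decreases; this mirrors, at a purely empirical level, the paper's statement that the sparsity parameter appears in the base of the exponent.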
Pages: 1004-1009 (6 pages)
Related Papers (items 31-40 of 50 shown)
  • [31] Unboundedness of Linear Regions of Deep ReLU Neural Networks
    Ponomarchuk, Anton
    Koutschan, Christoph
    Moser, Bernhard
    DATABASE AND EXPERT SYSTEMS APPLICATIONS, DEXA 2022 WORKSHOPS, 2022, 1633 : 3 - 10
  • [32] On a Fitting of a Heaviside Function by Deep ReLU Neural Networks
    Hagiwara, Katsuyuki
    NEURAL INFORMATION PROCESSING (ICONIP 2018), PT I, 2018, 11301 : 59 - 69
  • [33] A generative model for fBm with deep ReLU neural networks
    Allouche, Michael
    Girard, Stephane
    Gobet, Emmanuel
    JOURNAL OF COMPLEXITY, 2022, 73
  • [34] Convergence rates of deep ReLU networks for multiclass classification
    Bos, Thijs
    Schmidt-Hieber, Johannes
    ELECTRONIC JOURNAL OF STATISTICS, 2022, 16 (01): : 2724 - 2773
  • [35] Decision Boundary Formation of Deep Convolution Networks with ReLU
    Lee, C.
    Woo, S.
    2018 16TH IEEE INT CONF ON DEPENDABLE, AUTONOM AND SECURE COMP, 16TH IEEE INT CONF ON PERVAS INTELLIGENCE AND COMP, 4TH IEEE INT CONF ON BIG DATA INTELLIGENCE AND COMP, 3RD IEEE CYBER SCI AND TECHNOL CONGRESS (DASC/PICOM/DATACOM/CYBERSCITECH), 2018, : 885 - 888
  • [36] Towards Quantifying Intrinsic Generalization of Deep ReLU Networks
    Salman, Shaeke
    Zhang, Canlin
    Liu, Xiuwen
    Mio, Washington
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [37] Sparse bounds for oscillatory and random singular integrals
    Lacey, Michael T.
    Spencer, Scott
    NEW YORK JOURNAL OF MATHEMATICS, 2017, 23 : 119 - 131
  • [38] PERFORMANCE BOUNDS FOR SPARSE ESTIMATION WITH RANDOM NOISE
    Ben-Haim, Zvika
    Eldar, Yonina C.
    2009 IEEE/SP 15TH WORKSHOP ON STATISTICAL SIGNAL PROCESSING, VOLS 1 AND 2, 2009, : 225 - 228
  • [39] Generalization bounds for sparse random feature expansions
    Hashemi, Abolfazl
    Schaeffer, Hayden
    Shi, Robert
    Topcu, Ufuk
    Tran, Giang
    Ward, Rachel
    APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2023, 62 : 310 - 330
  • [40] Adversarial Examples in Multi-Layer Random ReLU Networks
    Bartlett, Peter L.
    Bubeck, Sebastien
    Cherapanamjeri, Yeshwanth
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34