Optimal Approximation Rates for Deep ReLU Neural Networks on Sobolev and Besov Spaces

Cited by: 0
Authors
Siegel, Jonathan W. [1]
Institution
[1] Texas A&M Univ, Dept Math, College Stn, TX 77843 USA
Funding
National Science Foundation (USA)
Keywords
COMPRESSION; BOUNDS;
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Let $\Omega=[0,1]^d$ be the unit cube in $\mathbb{R}^d$. We study the problem of how efficiently, in terms of the number of parameters, deep neural networks with the ReLU activation function can approximate functions in the Sobolev spaces $W^s(L_q(\Omega))$ and Besov spaces $B^s_r(L_q(\Omega))$, with error measured in the $L_p(\Omega)$ norm. This problem is important when studying the application of neural networks in a variety of fields, including scientific computing and signal processing, and has previously been solved only when $p=q=\infty$. Our contribution is to provide a complete solution for all $1 \le p, q \le \infty$ and $s>0$ for which the corresponding Sobolev or Besov space compactly embeds into $L_p$. The key technical tool is a novel bit-extraction technique which gives an optimal encoding of sparse vectors. This enables us to obtain sharp upper bounds in the non-linear regime where $p>q$. We also provide a novel method for deriving $L_p$-approximation lower bounds based upon VC-dimension when $p<\infty$. Our results show that very deep ReLU networks significantly outperform classical methods of approximation in terms of the number of parameters, but that this comes at the cost of parameters which are not encodable.
Pages: 52
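For orientation, the sketch below (in LaTeX, and not quoted from the paper itself) records the standard compact-embedding condition behind the phrase "compactly embeds into $L_p$", together with the form that such parameter-counting rates take in the earlier $p=q=\infty$ literature. The exponent $2s/d$ in the last display is an assumption carried over from that literature, not a statement extracted from this record.

% Assumed standard Sobolev/Besov embedding condition on Omega = [0,1]^d:
% W^s(L_q(Omega)) embeds compactly into L_p(Omega) when
\[
  \frac{s}{d} > \frac{1}{q} - \frac{1}{p}, \qquad s > 0, \quad 1 \le p, q \le \infty.
\]
% Classical (nonlinear) approximation with N degrees of freedom typically yields
\[
  \inf_{f_N} \, \| f - f_N \|_{L_p(\Omega)} \;\lesssim\; N^{-s/d} \, \| f \|_{W^s(L_q(\Omega))},
\]
% whereas very deep ReLU networks with N parameters are reported (in the
% p = q = infinity case, and, per this abstract, for general p and q) to roughly
% double the exponent, at the cost of weights that admit no efficient encoding:
\[
  \inf_{f_N \in \text{ReLU nets with } N \text{ params}} \| f - f_N \|_{L_p(\Omega)} \;\lesssim\; N^{-2s/d} \, \| f \|_{W^s(L_q(\Omega))}.
\]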