Error bounds for approximations with deep ReLU neural networks in Ws,p norms

Cited by: 89
Authors
Guehring, Ingo [1 ]
Kutyniok, Gitta [1 ,2 ,3 ]
Petersen, Philipp [4 ]
Affiliations
[1] Tech Univ Berlin, Inst Math, Berlin, Germany
[2] Tech Univ Berlin, Dept Comp Sci & Elect Engn, Berlin, Germany
[3] Univ Tromso, Dept Phys & Technol, Tromso, Norway
[4] Univ Oxford, Math Inst, Oxford, England
Keywords
Deep neural networks; approximation rates; Sobolev spaces; PDEs; curse of dimension; ALGORITHM; SMOOTH;
DOI
10.1142/S0219530519410021
CLC number
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
We analyze to what extent deep Rectified Linear Unit (ReLU) neural networks can efficiently approximate Sobolev regular functions if the approximation error is measured with respect to weaker Sobolev norms. In this context, we first establish upper approximation bounds by ReLU neural networks for Sobolev regular functions by explicitly constructing the approximating ReLU neural networks. Then, we establish lower approximation bounds for the same type of function classes. A trade-off between the regularity used in the approximation norm and the complexity of the neural network can be observed in both the upper and the lower bounds. Our results extend recent advances in the approximation theory of ReLU networks to the regime that is most relevant for applications in the numerical analysis of partial differential equations.
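As a reading aid, the regularity/complexity trade-off described in the abstract can be sketched as follows. This is only an illustrative form inferred from the abstract (approximation of W^{n,p}-regular functions with the error measured in a weaker W^{s,p} norm); the precise constants, logarithmic factors, and admissible parameter ranges are those stated in the paper itself, and the notation \mathcal{N}_M is introduced here only for the sketch.

\[
\inf_{\Phi \in \mathcal{N}_M} \| f - \Phi \|_{W^{s,p}((0,1)^d)} \;\lesssim\; M^{-(n-s)/d},
\qquad f \in W^{n,p}((0,1)^d), \quad 0 \le s \le 1,
\]

where \mathcal{N}_M denotes the class of ReLU networks with at most M nonzero weights. Read differently: achieving accuracy \varepsilon in the W^{s,p} norm requires on the order of \varepsilon^{-d/(n-s)} weights (up to logarithmic factors), so measuring the error in a weaker norm (smaller s) lowers the network complexity needed, which is the trade-off visible in both the upper and the lower bounds.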
Pages: 803-859
Page count: 57
Related papers
50 items total
  • [41] Mapping Distributional Semantics to Property Norms with Deep Neural Networks
    Li, Dandan
    Summers-Stay, Douglas
    BIG DATA AND COGNITIVE COMPUTING, 2019, 3 (02) : 1 - 11
  • [42] Hoffman's error bounds and uniform Lipschitz continuity of best l_p-approximations
    Berens, H
    Finzel, M
    Li, W
    Xu, Y
    JOURNAL OF MATHEMATICAL ANALYSIS AND APPLICATIONS, 1997, 213 (01) : 183 - 201
  • [43] A Framework for the Construction of Upper Bounds on the Number of Affine Linear Regions of ReLU Feed-Forward Neural Networks
    Hinz, Peter
    van de Geer, Sara
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2019, 65 (11) : 7304 - 7324
  • [44] Generalization error bounds for iterative recovery algorithms unfolded as neural networks
    Schnoor, Ekkehard
    Behboodi, Arash
    Rauhut, Holger
    INFORMATION AND INFERENCE-A JOURNAL OF THE IMA, 2023, 12 (03)
  • [45] On the turnpike to design of deep neural networks: Explicit depth bounds
    Faulwasser, Timm
    Hempel, Arne-Jens
    Streif, Stefan
    IFAC JOURNAL OF SYSTEMS AND CONTROL, 2024, 30
  • [46] ERROR ANALYSIS FOR DEEP NEURAL NETWORK APPROXIMATIONS OF PARAMETRIC HYPERBOLIC CONSERVATION LAWS
    De Ryck, T.
    Mishra, S.
    MATHEMATICS OF COMPUTATION, 2024, 93 (350) : 2643 - 2677
  • [47] Full error analysis for the training of deep neural networks
    Beck, Christian
    Jentzen, Arnulf
    Kuckuck, Benno
    INFINITE DIMENSIONAL ANALYSIS QUANTUM PROBABILITY AND RELATED TOPICS, 2022, 25 (02)
  • [48] Layout Error Correction using Deep Neural Networks
    Mohan, Srie Raam
    Bukhari, Syed Saqib
    Dengel, Andreas
    2018 13TH IAPR INTERNATIONAL WORKSHOP ON DOCUMENT ANALYSIS SYSTEMS (DAS), 2018, : 299 - 304
  • [49] VIDEO ERROR CONCEALMENT USING DEEP NEURAL NETWORKS
    Sankisa, Arun
    Punjabi, Arjun
    Katsaggelos, Aggelos K.
    2018 25TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2018, : 380 - 384
  • [50] WHY RELU UNITS SOMETIMES DIE: ANALYSIS OF SINGLE-UNIT ERROR BACKPROPAGATION IN NEURAL NETWORKS
    Douglas, Scott C.
    Yu, Jiutian
    2018 CONFERENCE RECORD OF 52ND ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS, AND COMPUTERS, 2018, : 864 - 868