Nonlinear Approximation and (Deep) ReLU Networks

Citations: 0
Authors
Daubechies, I. [1 ]
DeVore, R. [2 ]
Foucart, S. [2 ]
Hanin, B. [2 ,3 ,4 ]
Petrova, G. [2 ]
Affiliations
[1] Duke Univ, Dept Math, Durham, NC 27708 USA
[2] Texas A&M Univ, Dept Math, College Station, TX 77843 USA
[3] Facebook AI Res, New York, NY USA
[4] Princeton Univ, Dept Operat Res & Financial Engn, Sherrerd Hall, Charlton St, Princeton, NJ 08544 USA
Keywords
Neural networks; Rectified linear unit (ReLU); Expressiveness; Approximation power
DOI
10.1007/s00365-021-09548-z
CLC Number
O1 [Mathematics]
Discipline Codes
0701; 070101
Abstract
This article is concerned with the approximation and expressive powers of deep neural networks. This is an active research area currently producing many interesting papers. The results most commonly found in the literature prove that neural networks approximate functions with classical smoothness to the same accuracy as classical linear methods of approximation, e.g., approximation by polynomials or by piecewise polynomials on prescribed partitions. However, approximation by neural networks depending on n parameters is a form of nonlinear approximation and as such should be compared with other nonlinear methods, such as variable-knot splines or n-term approximation from dictionaries. The performance of neural networks in targeted applications such as machine learning indicates that they actually possess even greater approximation power than these traditional methods of nonlinear approximation. The main results of this article prove that this is indeed the case. This is done by exhibiting large classes of functions that can be efficiently captured by neural networks, whereas classical nonlinear methods fall short of the task. The present article purposefully limits itself to studying the approximation of univariate functions by ReLU networks. Many generalizations to functions of several variables and to other activation functions can be envisioned. However, even in this simplest of settings, a theory that completely quantifies the approximation power of neural networks is still lacking.
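The fixed-knot versus free-knot distinction in the abstract can be made concrete in a few lines. The sketch below is illustrative and not from the paper: the helper `relu_net` and the uniform-knot choice are assumptions made here for the example. It builds a one-hidden-layer ReLU network realizing the continuous piecewise-linear interpolant of a univariate function at uniform knots t_k = k/n, which is a classical *linear* method; a nonlinear (free-knot) method would additionally optimize the breakpoint locations.

```python
import numpy as np

def relu(z):
    # Rectified linear unit, applied componentwise.
    return np.maximum(z, 0.0)

def relu_net(f, n):
    """One-hidden-layer ReLU network realizing the piecewise-linear
    interpolant of f at the uniform knots t_k = k/n on [0, 1].
    (Illustrative helper; the paper itself studies free-knot rates.)"""
    t = np.linspace(0.0, 1.0, n + 1)
    y = f(t)
    s = np.diff(y) / np.diff(t)                 # slope on each subinterval
    c = np.concatenate(([s[0]], np.diff(s)))    # slope jumps at the knots
    d = y[0]                                    # output bias
    def net(x):
        # d + sum_k c_k * relu(x - t_k): hidden units with weights 1,
        # biases -t_k, and output-layer weights c_k.
        return d + sum(ck * relu(x - tk) for ck, tk in zip(c, t[:-1]))
    return net

g = lambda x: x * x
xs = np.linspace(0.0, 1.0, 1001)
for n in (4, 8, 16):
    err = np.max(np.abs(relu_net(g, n)(xs) - g(xs)))
    # error decays like O(1/n^2), the fixed-knot piecewise-linear rate
    print(f"n={n:2d}  sup-error ~ {err:.2e}")
```

With n hidden units the network is piecewise linear with breakpoints at the t_k; here those breakpoints are prescribed, which is exactly the "linear method" baseline the abstract compares against.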
Pages: 127-172 (46 pages)
Related Papers (50 total)
  • [1] Song, Linhao; Fan, Jun; Chen, Di-Rong; Zhou, Ding-Xuan. Approximation of Nonlinear Functionals Using Deep ReLU Networks. Journal of Fourier Analysis and Applications, 2023, 29(4).
  • [2] Song, Linhao; Fan, Jun; Chen, Di-Rong; Zhou, Ding-Xuan. Correction: Approximation of Nonlinear Functionals Using Deep ReLU Networks. Journal of Fourier Analysis and Applications, 2023, 29(5).
  • [3] Voigtlaender, Felix; Petersen, Philipp. Approximation in Lp(μ) with deep ReLU neural networks. 2019 13th International Conference on Sampling Theory and Applications (SampTA), 2019.
  • [4] Song, Linhao; Liu, Ying; Fan, Jun; Zhou, Ding-Xuan. Approximation of smooth functionals using deep ReLU networks. Neural Networks, 2023, 166: 424-436.
  • [5] Dung, Dinh; Nguyen, Van Kien. Deep ReLU neural networks in high-dimensional approximation. Neural Networks, 2021, 142: 619-635.
  • [6] Chen, Liang; Liu, Wenjun. On the uniform approximation estimation of deep ReLU networks via frequency decomposition. AIMS Mathematics, 2022, 7(10): 19018-19025.
  • [7] Yang, Yunfei; Li, Zhen; Wang, Yang. Approximation in shift-invariant spaces with deep ReLU neural networks. Neural Networks, 2022, 153: 269-281.
  • [8] Chen, Minshuo; Jiang, Haoming; Liao, Wenjing; Zhao, Tuo. Efficient Approximation of Deep ReLU Networks for Functions on Low Dimensional Manifolds. Advances in Neural Information Processing Systems 32 (NeurIPS 2019), 2019.