Deep ReLU Networks Overcome the Curse of Dimensionality for Generalized Bandlimited Functions

Cited by: 17
Authors
Montanelli, Hadrien [1 ]
Yang, Haizhao [2 ]
Du, Qiang [3 ]
Affiliations
[1] Ecole Polytech, Ctr Math Appl, Palaiseau, France
[2] Purdue Univ, Dept Math, W Lafayette, IN 47907 USA
[3] Columbia Univ, Dept Appl Phys & Appl Math, New York, NY USA
Source
JOURNAL OF COMPUTATIONAL MATHEMATICS | 2021, Vol. 39, No. 6
Keywords
Machine learning; Deep ReLU networks; Curse of dimensionality; Approximation theory; Bandlimited functions; Chebyshev polynomials; ERROR-BOUNDS; OPTIMAL APPROXIMATION; SUPERPOSITION; SMOOTH
DOI
10.4208/jcm.2007-m2019-0239
CLC classification number
O29 [Applied Mathematics]
Subject classification code
070104
Abstract
We prove a theorem concerning the approximation of generalized bandlimited multivariate functions by deep ReLU networks, for which the curse of dimensionality is overcome. Our theorem is based on a result of Maurey and on the ability of deep ReLU networks to approximate Chebyshev polynomials and analytic functions efficiently.
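As a minimal, hedged sketch (not the paper's explicit network construction, which comes with proved size and error bounds), the Python/PyTorch snippet below simply fits a small deep ReLU network to the Chebyshev polynomial T_5(x) = 16x^5 - 20x^3 + 5x on [-1, 1], the kind of building block the theorem relies on; the architecture, width, and training setup are illustrative assumptions only.

    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    x = torch.linspace(-1.0, 1.0, 512).unsqueeze(1)   # grid on [-1, 1]
    y = 16 * x**5 - 20 * x**3 + 5 * x                 # T_5(x), a Chebyshev polynomial

    net = nn.Sequential(          # a small deep ReLU network (3 hidden layers)
        nn.Linear(1, 32), nn.ReLU(),
        nn.Linear(32, 32), nn.ReLU(),
        nn.Linear(32, 32), nn.ReLU(),
        nn.Linear(32, 1),
    )
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for step in range(5000):      # plain least-squares fit on the grid
        opt.zero_grad()
        loss = nn.functional.mse_loss(net(x), y)
        loss.backward()
        opt.step()
    print(f"max |net - T_5| on grid: {(net(x) - y).abs().max().item():.2e}")

This empirical fit says nothing about rates; the paper's point is that explicit deep ReLU constructions achieve such approximations with network sizes that do not grow exponentially in the input dimension.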
Pages: 801-815
Number of pages: 15