DEEP RELU NETWORKS OVERCOME THE CURSE OF DIMENSIONALITY FOR GENERALIZED BANDLIMITED FUNCTIONS

Cited by: 17
Authors
Montanelli, Hadrien [1 ]
Yang, Haizhao [2 ]
Du, Qiang [3 ]
Affiliations
[1] Ecole Polytech, Ctr Math Appl, Palaiseau, France
[2] Purdue Univ, Dept Math, W Lafayette, IN 47907 USA
[3] Columbia Univ, Dept Appl Phys & Appl Math, New York, NY USA
Source
JOURNAL OF COMPUTATIONAL MATHEMATICS | 2021, Vol. 39, No. 6
Keywords
Machine learning; Deep ReLU networks; Curse of dimensionality; Approximation theory; Bandlimited functions; Chebyshev polynomials; ERROR-BOUNDS; OPTIMAL APPROXIMATION; SUPERPOSITION; SMOOTH
DOI
10.4208/jcm.2007-m2019-0239
Chinese Library Classification (CLC)
O29 [Applied Mathematics]
Discipline Classification Code
070104
Abstract
We prove a theorem concerning the approximation of generalized bandlimited multivariate functions by deep ReLU networks for which the curse of dimensionality is overcome. Our theorem is based on a result by Maurey and on the ability of deep ReLU networks to approximate Chebyshev polynomials and analytic functions efficiently.
Pages: 801-815
Page count: 15
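
The second ingredient named in the abstract, the ability of deep ReLU networks to approximate polynomials (and hence Chebyshev polynomials and analytic functions) efficiently, rests on ReLU networks emulating the squaring map. The sketch below is not code from the paper; it is a minimal NumPy illustration of the standard Yarotsky-type construction, in which a depth-m composition of a ReLU "hat" function reproduces x^2 on [0,1] with error decaying like 4^(-m).

    import numpy as np

    def relu(x):
        return np.maximum(x, 0.0)

    def hat(x):
        # Hat ("tooth") function g on [0,1] built from three ReLU units:
        # g(x) = 2x on [0, 1/2] and g(x) = 2 - 2x on [1/2, 1].
        return 2 * relu(x) - 4 * relu(x - 0.5) + 2 * relu(x - 1.0)

    def relu_square(x, m):
        # Depth-m ReLU approximation of x**2 on [0,1]:
        #   f_m(x) = x - sum_{s=1}^m g^(s)(x) / 4**s,
        # where g^(s) is the s-fold composition of the hat function.
        out = x.copy()
        g = x.copy()
        for s in range(1, m + 1):
            g = hat(g)            # one more layer of composition
            out -= g / 4**s
        return out

    x = np.linspace(0.0, 1.0, 10001)
    for m in (2, 4, 6, 8):
        err = np.max(np.abs(relu_square(x, m) - x**2))
        print(f"m = {m}: max error {err:.2e} (bound 4^-(m+1) = {4.0**-(m+1):.2e})")

Each extra layer of composition cuts the error by a factor of four, which is the kind of depth-versus-accuracy trade-off that underlies efficient ReLU approximation of products and polynomials in this literature.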