Deep ReLU neural networks in high-dimensional approximation

Cited by: 15
Authors
Dung, Dinh [1 ]
Nguyen, Van Kien [2 ]
Affiliations
[1] Vietnam Natl Univ, Informat Technol Inst, Hanoi 144 Xuan Thuy, Hanoi, Vietnam
[2] Univ Transport & Commun, Fac Basic Sci, 3 Cau Giay St, Hanoi, Vietnam
Keywords
Deep ReLU neural network; Computation complexity; High-dimensional approximation; Sparse-grid sampling; Continuous piecewise linear functions; Hölder-Zygmund space of mixed smoothness
DOI
10.1016/j.neunet.2021.07.027
Chinese Library Classification
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
We study the computation complexity of deep ReLU (Rectified Linear Unit) neural networks for the approximation of functions from the Hölder-Zygmund space of mixed smoothness defined on the d-dimensional unit cube, where the dimension d may be very large. The approximation error is measured in the norm of the isotropic Sobolev space. For every function f from the Hölder-Zygmund space of mixed smoothness, we explicitly construct a deep ReLU neural network whose output approximates f with a prescribed accuracy ε, and we prove tight dimension-dependent upper and lower bounds on the computation complexity of this approximation, characterized by the size and depth of the network, explicitly in d and ε. The proof of these results relies, in particular, on approximation by sparse-grid sampling recovery based on the Faber series. © 2021 Elsevier Ltd. All rights reserved.
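The explicit construction described in the abstract builds on the Faber series, whose building blocks are dyadic dilates and translates of the univariate hat function, a continuous piecewise linear function that a ReLU network can represent exactly. The sketch below (Python with NumPy) only illustrates this elementary building block; it is not the authors' construction, and all function names are illustrative.

    import numpy as np

    def relu(x):
        # ReLU activation applied elementwise.
        return np.maximum(x, 0.0)

    def hat(x):
        # Reference hat function on [0, 1]: rises linearly to 1 at x = 1/2,
        # falls back to 0 at x = 1, and vanishes outside [0, 1].
        return np.minimum(np.maximum(2.0 * x, 0.0), np.maximum(2.0 - 2.0 * x, 0.0))

    def hat_via_relu(x):
        # Exact one-hidden-layer ReLU representation of the hat function:
        # hat(x) = 2*ReLU(x) - 4*ReLU(x - 1/2) + 2*ReLU(x - 1).
        return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

    def faber_function(j, k, x):
        # Dyadic dilate/translate of the hat function (a univariate Faber function),
        # supported on the interval [k * 2**(-j), (k + 1) * 2**(-j)].
        return hat_via_relu(2.0 ** j * x - k)

    if __name__ == "__main__":
        x = np.linspace(-0.5, 1.5, 2001)
        # The ReLU form reproduces the hat function exactly (up to floating-point rounding).
        assert np.allclose(hat(x), hat_via_relu(x))
        print("max deviation:", np.abs(hat(x) - hat_via_relu(x)).max())

In the d-variate, mixed-smoothness setting, the Faber system consists of tensor products of such univariate functions over dyadic levels restricted to a sparse grid; deep ReLU networks can only approximate (not exactly represent) these products, which is one source of the depth and of the dimension-dependent size bounds in constructions of this kind.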
Pages: 619-635
Number of pages: 17
Related papers
50 records in total
  • [1] Approximation in LP(μ) with deep ReLU neural networks
    Voigtlaender, Felix
    Petersen, Philipp
    [J]. 2019 13TH INTERNATIONAL CONFERENCE ON SAMPLING THEORY AND APPLICATIONS (SAMPTA), 2019
  • [2] Efficient Approximation of High-Dimensional Functions With Neural Networks
    Cheridito, Patrick
    Jentzen, Arnulf
    Rossmannek, Florian
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (07) : 3079 - 3093
  • [3] Model Complexity of Neural Networks in High-Dimensional Approximation
    Kurkova, Vera
    [J]. RECENT ADVANCES IN INTELLIGENT ENGINEERING SYSTEMS, 2012, 378 : 151 - 160
  • [4] Neural networks trained with high-dimensional functions approximation data in high-dimensional space
    Zheng, Jian
    Wang, Jianfeng
    Chen, Yanping
    Chen, Shuping
    Chen, Jingjin
    Zhong, Wenlong
    Wu, Wenling
    [J]. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2021, 41 (02) : 3739 - 3750
  • [5] Effective approximation of high-dimensional space using neural networks
    Zheng, Jian
    Wang, Jianfeng
    Chen, Yanping
    Chen, Shuping
    Chen, Jingjin
    Zhong, Wenlong
    Wu, Wenling
    [J]. JOURNAL OF SUPERCOMPUTING, 2022, 78 (03) : 4377 - 4397
  • [6] Nonlinear Approximation and (Deep) ReLU Networks
    Daubechies, I.
    DeVore, R.
    Foucart, S.
    Hanin, B.
    Petrova, G.
    [J]. CONSTRUCTIVE APPROXIMATION, 2022, 55 (01) : 127 - 172
  • [7] Approximation in shift-invariant spaces with deep ReLU neural networks
    Yang, Yunfei
    Li, Zhen
    Wang, Yang
    [J]. NEURAL NETWORKS, 2022, 153 : 269 - 281
  • [8] Low dimensional approximation and generalization of multivariate functions on smooth manifolds using deep ReLU neural networks
    Labate, Demetrio
    Shi, Ji
    [J]. NEURAL NETWORKS, 2024, 174