Approximation Spaces of Deep Neural Networks

Citations: 0
Authors
Rémi Gribonval
Gitta Kutyniok
Morten Nielsen
Felix Voigtlaender
Affiliations
[1] Univ Lyon, Department of Mathematics
[2] EnsL, Department of Mathematical Sciences
[3] UCBL, Department of Scientific Computing
[4] CNRS
[5] Inria
[6] LIP
[7] Ludwig-Maximilians-Universität München
[8] Aalborg University
[9] Katholische Universität Eichstätt-Ingolstadt
Source
Constructive Approximation, 2022, 55 (1)
Keywords
Deep neural networks; Sparsely connected networks; Approximation spaces; Besov spaces; Direct estimates; Inverse estimates; Piecewise polynomials; ReLU activation function; Primary 82C32; 41A65; Secondary 68T05; 41A46; 42C40;
DOI
Not available
Abstract
We study the expressivity of deep neural networks. Measuring a network’s complexity by its number of connections or its number of neurons, we consider the class of functions for which the error of best approximation by networks of a given complexity decays at a certain rate as the complexity budget increases. Using results from classical approximation theory, we show that this class can be endowed with a (quasi-)norm that makes it a linear function space, called an approximation space. We establish that allowing the networks certain types of “skip connections” does not change the resulting approximation spaces. We also discuss the role of the network’s nonlinearity (i.e., its activation function) in the resulting spaces, as well as the role of depth. For the popular ReLU nonlinearity and its powers, we relate the newly constructed spaces to classical Besov spaces. The established embeddings highlight that some functions of very low Besov smoothness can nevertheless be well approximated by neural networks, provided these networks are sufficiently deep.
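The construction the abstract refers to follows the classical approximation-space framework in the sense of DeVore; the following is a sketch under that standard definition, with the notation (Σ_n, E_n, A^α_q) introduced here for illustration rather than taken verbatim from the paper:

```latex
% Let X be a quasi-Banach space (e.g. X = L^p(\Omega)), and let
% \Sigma_n \subset X denote the functions realized by networks with at
% most n connections (or n neurons). The best-approximation error of
% f \in X from \Sigma_n is
%   E_n(f) := \inf_{g \in \Sigma_n} \| f - g \|_X .
% For a rate \alpha > 0 and fine index 0 < q < \infty, the approximation
% class collects all f whose error decays like n^{-\alpha}:
\[
  A^{\alpha}_{q}(X)
  := \Bigl\{ f \in X \;:\;
     \| f \|_{A^{\alpha}_{q}}
     := \Bigl( \sum_{n \ge 1}
        \bigl[ n^{\alpha}\, E_{n-1}(f) \bigr]^{q}\, \tfrac{1}{n}
     \Bigr)^{1/q} < \infty \Bigr\},
\]
% with the usual supremum modification for q = \infty:
\[
  \| f \|_{A^{\alpha}_{\infty}}(X) := \sup_{n \ge 1}\, n^{\alpha}\, E_{n-1}(f).
\]
```

For ‖·‖ to be a quasi-norm, the sets Σ_n must satisfy structural conditions such as Σ_n + Σ_n ⊂ Σ_{cn} for some constant c; verifying such properties for network classes is where features like skip connections and depth become relevant.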
Pages: 259–367 (108 pages)
Related Papers (50 total)
  • [1] Approximation Spaces of Deep Neural Networks
    Gribonval, Rémi; Kutyniok, Gitta; Nielsen, Morten; Voigtlaender, Felix
    Constructive Approximation, 2022, 55 (1): 259-367
  • [2] Approximation in shift-invariant spaces with deep ReLU neural networks
    Yang, Yunfei; Li, Zhen; Wang, Yang
    Neural Networks, 2022, 153: 269-281
  • [3] Approximation of functions from Korobov spaces by deep convolutional neural networks
    Mao, Tong; Zhou, Ding-Xuan
    Advances in Computational Mathematics, 2022, 48 (6)
  • [4] Optimal Approximation Rates for Deep ReLU Neural Networks on Sobolev and Besov Spaces
    Siegel, Jonathan W.
    Journal of Machine Learning Research, 2023, 24
  • [5] Universal approximation with neural networks on function spaces
    Kumagai, Wataru; Sannai, Akiyoshi; Kawano, Makoto
    Journal of Experimental & Theoretical Artificial Intelligence, 2024, 36 (7): 1089-1100
  • [6] Continuity of approximation by neural networks in Lp spaces
    Kainen, Paul C.; Kůrková, Věra; Vogt, Andrew
    Annals of Operations Research, 2001, 101 (1-4): 143-147
  • [7] Limitations on approximation by deep and shallow neural networks
    Petrova, Guergana; Wojtaszczyk, Przemyslaw
    Journal of Machine Learning Research, 2023, 24
  • [8] Provable approximation properties for deep neural networks
    Shaham, Uri; Cloninger, Alexander; Coifman, Ronald R.
    Applied and Computational Harmonic Analysis, 2018, 44 (3): 537-557