Approximation Spaces of Deep Neural Networks

Cited by: 0
Authors
Rémi Gribonval
Gitta Kutyniok
Morten Nielsen
Felix Voigtlaender
Affiliations
[1] Univ Lyon, Department of Mathematics
[2] EnsL, Department of Mathematical Sciences
[3] UCBL, Department of Scientific Computing
[4] CNRS
[5] Inria
[6] LIP
[7] Ludwig-Maximilians-Universität München
[8] Aalborg University
[9] Katholische Universität Eichstätt-Ingolstadt
Keywords
Deep neural networks; Sparsely connected networks; Approximation spaces; Besov spaces; Direct estimates; Inverse estimates; Piecewise polynomials; ReLU activation function; Primary 82C32; 41A65; Secondary 68T05; 41A46; 42C40;
DOI
Not available
Abstract
We study the expressivity of deep neural networks. Measuring a network’s complexity by its number of connections or by its number of neurons, we consider the class of functions for which the error of best approximation with networks of a given complexity decays at a certain rate when increasing the complexity budget. Using results from classical approximation theory, we show that this class can be endowed with a (quasi)-norm that makes it a linear function space, called approximation space. We establish that allowing the networks to have certain types of “skip connections” does not change the resulting approximation spaces. We also discuss the role of the network’s nonlinearity (also known as activation function) on the resulting spaces, as well as the role of depth. For the popular ReLU nonlinearity and its powers, we relate the newly constructed spaces to classical Besov spaces. The established embeddings highlight that some functions of very low Besov smoothness can nevertheless be well approximated by neural networks, if these networks are sufficiently deep.
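For orientation, the construction the abstract alludes to follows the classical approximation-space framework; the sketch below uses standard notation from approximation theory (the paper's precise complexity classes and admissibility conditions on the sets $\Sigma_n$ differ in detail):

```latex
% Let $X$ be a (quasi-)Banach space and $\Sigma_n \subset X$ the set of
% functions realizable by networks of complexity at most $n$
% (e.g., at most $n$ connections or $n$ neurons).
% The error of best approximation of $f$ from $\Sigma_n$ is
\[
  E_n(f) := \inf_{g \in \Sigma_n} \| f - g \|_X .
\]
% For a decay rate $\alpha > 0$ and fine index $0 < q \le \infty$, the
% approximation space collects all $f$ whose best-approximation error
% decays like $n^{-\alpha}$:
\[
  A_q^\alpha(X) :=
  \Bigl\{ f \in X \,:\,
    \| f \|_{A_q^\alpha} := \| f \|_X
    + \Bigl( \sum_{n \ge 1} \bigl[ n^{\alpha} E_n(f) \bigr]^q \, \tfrac{1}{n}
      \Bigr)^{1/q}
    < \infty
  \Bigr\},
\]
% with the usual supremum modification when $q = \infty$.  Under suitable
% conditions on the sets $\Sigma_n$, $\|\cdot\|_{A_q^\alpha}$ is a
% (quasi-)norm making $A_q^\alpha(X)$ a linear space.
```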
Pages: 259–367 (108 pages)