Lower bounds on the complexity of approximating continuous functions by sigmoidal neural networks

Cited: 0
Author
Schmitt, M [1 ]
Affiliation
[1] Ruhr Univ Bochum, Fak Math, Lehrstuhl Math & Informat, D-44780 Bochum, Germany
Keywords
DOI
Not available
CLC number
TP18 [Artificial intelligence theory];
Discipline codes
081104; 0812; 0835; 1405
Abstract
We calculate lower bounds on the size of sigmoidal neural networks that approximate continuous functions. In particular, we show that for the approximation of polynomials the network size has to grow as Ω((log k)^{1/4}), where k is the degree of the polynomials. This bound is valid for any input dimension, i.e., independently of the number of variables. The result is obtained by introducing a new method employing upper bounds on the Vapnik-Chervonenkis dimension for proving lower bounds on the size of networks that approximate continuous functions.
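The shape of the stated bound can be read as the meeting point of two ingredients: an upper bound on the VC dimension of sigmoidal networks in terms of their size, and a shattering construction extracted from the approximated polynomial. The LaTeX display below is only a hedged sketch of how such an argument typically runs; the constant c and the exponent 4 in the assumed VC-dimension upper bound are illustrative assumptions (a Karpinski-Macintyre-style polynomial bound), not quantities quoted from the paper.

% Hedged sketch, under two assumptions not taken from the paper:
%   (1) VCdim of a sigmoidal network of size s is at most c * s^4;
%   (2) a network approximating a degree-k polynomial can be used to
%       shatter Omega(log k) points.
\[
  \underbrace{c\,s^{4}}_{\text{assumed VC-dimension upper bound}}
  \;\ge\; \mathrm{VCdim}
  \;\ge\; \underbrace{\Omega(\log k)}_{\text{points shattered via the approximated polynomial}}
  \quad\Longrightarrow\quad
  s \;=\; \Omega\!\bigl((\log k)^{1/4}\bigr).
\]

Under these assumptions the network size s must satisfy s ≥ ((a/c) log k)^{1/4} for some constant a, which has the same form as the abstract's bound and, since the shattering step does not involve the number of input variables, is consistent with the claim that the bound holds for any input dimension.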
Pages: 328 - 334
Page count: 7
Related articles
50 records in total
  • [31] Hilbert function and complexity lower bounds for symmetric Boolean functions
    Bernasconi, A
    Egidi, L
    INFORMATION AND COMPUTATION, 1999, 153 (01) : 1 - 25
  • [32] Explicit Lower Bounds for Communication Complexity of PSM for Concrete Functions
    Shinagawa, Kazumasa
    Nuida, Koji
    PROGRESS IN CRYPTOLOGY - INDOCRYPT 2023, PT II, 2024, 14460 : 45 - 61
  • [33] Complexity of Gaussian-radial-basis networks approximating smooth functions
    Kainen, Paul C.
    Kurkova, Vera
    Sanguineti, Marcello
    JOURNAL OF COMPLEXITY, 2009, 25 (01) : 63 - 74
  • [34] ARE LOWER BOUNDS ON THE COMPLEXITY LOWER BOUNDS FOR UNIVERSAL CIRCUITS
    NIGMATULLIN, RG
    LECTURE NOTES IN COMPUTER SCIENCE, 1985, 199 : 331 - 340
  • [35] Constructive lower bounds on model complexity of shallow perceptron networks
    Kurkova, Vera
    NEURAL COMPUTING & APPLICATIONS, 2018, 29 (07): 305 - 315
  • [36] LOWER BOUNDS ON COMMUNICATION COMPLEXITY IN DISTRIBUTED COMPUTER NETWORKS.
    Tiwari, Prasoon
    1600, (34):
  • [37] LOWER BOUNDS ON COMMUNICATION COMPLEXITY IN DISTRIBUTED COMPUTER-NETWORKS
    TIWARI, P
    JOURNAL OF THE ACM, 1987, 34 (04) : 921 - 938
  • [38] Constructive lower bounds on model complexity of shallow perceptron networks
    Věra Kůrková
    Neural Computing and Applications, 2018, 29 : 305 - 315
  • [39] Improved Bounds on Neural Complexity for Representing Piecewise Linear Functions
    Chen, Kuan-Lin
    Garudadri, Harinath
    Rao, Bhaskar D.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022
  • [40] Size and Depth Separation in Approximating Benign Functions with Neural Networks
    Vardi, Gal
    Reichman, Daniel
    Pitassi, Toniann
    Shamir, Ohad
    CONFERENCE ON LEARNING THEORY, VOL 134, 2021, 134