Lower bounds on the complexity of approximating continuous functions by sigmoidal neural networks

Cited by: 0
Authors
Schmitt, M [1 ]
Affiliation
[1] Ruhr Univ Bochum, Fak Math, Lehrstuhl Math & Informat, D-44780 Bochum, Germany
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
We calculate lower bounds on the size of sigmoidal neural networks that approximate continuous functions. In particular, we show that for the approximation of polynomials the network size has to grow as Omega((log k)^(1/4)), where k is the degree of the polynomials. This bound is valid for any input dimension, i.e., independently of the number of variables. The result is obtained by introducing a new method that employs upper bounds on the Vapnik-Chervonenkis dimension to prove lower bounds on the size of networks that approximate continuous functions.
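Restated in LaTeX for clarity (a sketch of the claim as given in the abstract; the notation size(N) for the number of computation units and p_k for a degree-k polynomial is assumed here, and the precise approximation criterion is specified in the paper):

\[
\min \{\, \mathrm{size}(N) : N \text{ is a sigmoidal network approximating } p_k \,\} \;=\; \Omega\big((\log k)^{1/4}\big),
\]

with the bound holding independently of the input dimension.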
Pages: 328-334
Page count: 7