Lower bounds for approximation by MLP neural networks

Cited by: 113
Authors
Maiorov, V [1 ]
Pinkus, A [1 ]
Affiliation
[1] Technion Israel Inst Technol, Dept Math, IL-32000 Haifa, Israel
Keywords
multilayer feedforward perceptron model; degree of approximation; lower bounds; Kolmogorov superposition theorem;
DOI
10.1016/S0925-2312(98)00111-8
Chinese Library Classification
TP18 [artificial intelligence theory];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
The degree of approximation by a single hidden layer MLP model with n units in the hidden layer is bounded below by the degree of approximation by a linear combination of n ridge functions. We prove that there exists an analytic, strictly monotone, sigmoidal activation function for which this lower bound is essentially attained. We also prove, using this same activation function, that one can approximate arbitrarily well any continuous function on any compact domain by a two hidden layer MLP using a fixed finite number of units in each layer. (C) 1999 Elsevier Science B.V. All rights reserved.
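As a reading aid, the lower bound stated in the abstract follows from a containment of approximating sets. The notation below ($\mathcal{M}_n(\sigma)$, $\mathcal{R}_n$, $E(f;\cdot)$, and the choice of norm) is standard approximation-theory shorthand assumed here; it is not taken from the record itself.

% Assumed notation, not from the original record.
% Single-hidden-layer MLP outputs with n units and activation sigma:
%   \mathcal{M}_n(\sigma) = \{ \sum_{i=1}^{n} c_i\,\sigma(w_i\cdot x + b_i) : c_i, b_i \in \mathbb{R},\; w_i \in \mathbb{R}^d \}
% Linear combinations of n ridge functions:
%   \mathcal{R}_n = \{ \sum_{i=1}^{n} g_i(a_i\cdot x) : a_i \in \mathbb{R}^d,\; g_i : \mathbb{R} \to \mathbb{R} \}
\[
  E(f;\mathcal{M}_n(\sigma)) = \inf_{h\in\mathcal{M}_n(\sigma)} \|f-h\|,
  \qquad
  \mathcal{M}_n(\sigma)\subseteq\mathcal{R}_n
  \;\Longrightarrow\;
  E(f;\mathcal{M}_n(\sigma)) \ge E(f;\mathcal{R}_n),
\]
since each hidden unit $\sigma(w_i\cdot x + b_i)$ is itself a ridge function of $x$, so the infimum over the smaller set $\mathcal{M}_n(\sigma)$ can only be larger.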
Pages: 81-91
Page count: 11