Construction and approximation for a class of feedforward neural networks with sigmoidal function

Cited by: 0
Authors
Meng, Xinhong [1 ]
Yan, Jinyao [1 ]
Ye, Hailiang [2 ]
Cao, Feilong [2 ]
Affiliations
[1] Quanzhou Univ Informat Engn, Sch Elect & Commun Engn, Quanzhou 362000, Fujian, Peoples R China
[2] China Jiliang Univ, Coll Sci, Hangzhou 310018, Zhejiang, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Feedforward neural networks; approximation; direct theorem; inverse theorem; equivalence characterization; UNIVERSAL APPROXIMATION; UNIFORM APPROXIMATION; OPERATORS; ERRORS;
DOI
10.1142/S0219691323500285
Chinese Library Classification (CLC)
TP31 [Computer Software];
Discipline Classification Code
081202; 0835;
Abstract
It is well known that feedforward neural networks (FNNs) with a sigmoidal activation function are universal approximators: for any continuous function defined on a compact set, there exists an FNN that approximates the function with arbitrary accuracy. This property provides a theoretical guarantee that FNNs can serve as efficient learning machines. This paper addresses the construction and approximation of FNNs. We construct an FNN with a sigmoidal activation function and estimate its approximation error. In particular, an inverse theorem of approximation is established, which implies an equivalence characterization theorem for the approximation and reveals the relationship between the topological structure of the FNN and its approximation ability. The key tools in this study are the modulus of continuity of a function, the K-functional, and the relationship between them; in addition, two Bernstein-type inequalities are established.
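For orientation, the standard objects referred to in the abstract can be sketched as follows. This is a generic, univariate sketch using the usual textbook definitions; the paper's specific construction, domain, and norms may differ.

% Single-hidden-layer FNN with n hidden units and a sigmoidal activation \sigma
% (generic form; the paper constructs the coefficients c_k, w_k, b_k explicitly)
N_n(x) = \sum_{k=1}^{n} c_k \, \sigma(w_k x + b_k),
\qquad \sigma(t) = \frac{1}{1 + e^{-t}} \ \text{(e.g., the logistic sigmoid)}.

% Modulus of continuity of f \in C[a,b]
\omega(f, t) = \sup_{\substack{x, y \in [a,b] \\ |x - y| \le t}} |f(x) - f(y)|.

% Peetre K-functional of f with respect to the smoother class C^1[a,b]
K(f, t) = \inf_{g \in C^1[a,b]} \bigl\{ \|f - g\|_\infty + t \, \|g'\|_\infty \bigr\}.

In this setting, a direct theorem bounds the approximation error \|f - N_n\|_\infty from above in terms of \omega(f, \cdot) or K(f, \cdot), while an inverse theorem reasons in the opposite direction, deducing smoothness of f from the decay of the error; combining the two directions (with Bernstein-type inequalities for the network as the bridge) yields the equivalence characterization mentioned in the abstract.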
Pages: 12