On Approximation by Neural Networks with Optimized Activation Functions and Fixed Weights

Cited by: 1
Authors
Yu, Dansheng [1 ]
Qian, Yunyou [1 ]
Li, Fengjun [2 ]
Affiliations
[1] Hangzhou Normal Univ, Dept Math, Hangzhou 310036, Zhejiang, Peoples R China
[2] Ningxia Univ, Sch Math & Stat, Yinchuan 750021, Ningxia, Peoples R China
Source
ANALYSIS IN THEORY AND APPLICATIONS | 2023, Vol. 39, No. 1
Keywords
Approximation rate; modulus of continuity; modulus of smoothness; neural network operators; MULTILAYER FEEDFORWARD NETWORKS; UNIVERSAL APPROXIMATION; OPERATORS; BOUNDS;
DOI
10.4208/ata.OA-2021-0006
CLC Classification Code
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
Recently, Li [16] introduced three kinds of single-hidden-layer feed-forward neural networks (FNNs) with optimized piecewise linear activation functions and fixed weights, and obtained upper and lower bound estimates for the approximation accuracy of these FNNs for continuous functions defined on bounded intervals. In the present paper, we point out that there are errors both in the definitions of the FNNs and in the proofs of the upper estimates in [16]. Using new methods, we give correct estimates of the rates of approximation by Li's neural networks.
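Li's specific operators are not reproduced in this record, but the class the abstract describes, a single hidden layer, a piecewise-linear activation, inner weights and biases fixed in advance, and sampled function values as outer coefficients, can be illustrated with a minimal sketch. The hat-function construction below is a generic, assumed example of such an operator (not the construction from [16]); for a continuous f its sup-norm error is controlled by the modulus of continuity ω(f, h), the quantity named in the keywords.

```python
import numpy as np

def hat(t):
    """Piecewise-linear 'hat' activation: 1 at t = 0, decaying linearly to 0 at |t| = 1."""
    return np.maximum(0.0, 1.0 - np.abs(t))

def fnn_approx(f, a, b, n):
    """Single-hidden-layer network with n + 1 hidden units on [a, b].
    Inner weights (1/h) and biases (-x_i/h) are fixed by the uniform grid;
    only the outer coefficients f(x_i) depend on the target function.
    (Illustrative sketch only, not the operators defined in Li [16].)"""
    h = (b - a) / n
    nodes = a + h * np.arange(n + 1)
    coeffs = f(nodes)
    def N(x):
        x = np.asarray(x, dtype=float)
        # Each hidden unit evaluates hat((x - x_i) / h); broadcasting gives
        # shape (..., n + 1), which is summed against the outer coefficients.
        return (coeffs * hat((x[..., None] - nodes) / h)).sum(axis=-1)
    return N

if __name__ == "__main__":
    f = np.sin
    N = fnn_approx(f, 0.0, np.pi, n=32)
    xs = np.linspace(0.0, np.pi, 1001)
    # Sup-norm error; bounded by omega(f, h) for any continuous f.
    print(f"sup-norm error: {np.max(np.abs(f(xs) - N(xs))):.2e}")
```

Because the inner parameters are fixed and only the outer coefficients depend on f, approximation-rate bounds in terms of the modulus of continuity (or smoothness) are the natural way to measure such networks, which is the setting of both [16] and the present paper.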
Pages: 93-104
Page count: 12