Approximation results for neural network operators activated by sigmoidal functions

Cited: 102
|
Authors
Costarelli, Danilo [1 ]
Spigler, Renato [1 ]
Affiliations
[1] Univ Roma Tre 1, Dipartimento Matemat, I-00146 Rome, Italy
Keywords
Sigmoidal functions; Neural networks operators; Uniform approximation; Order of approximation; Lipschitz classes; ONE HIDDEN LAYER; SUPERPOSITIONS;
DOI
10.1016/j.neunet.2013.03.015
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we study pointwise and uniform convergence, as well as the order of approximation, for a family of linear positive neural network operators activated by certain sigmoidal functions. Only the case of functions of one variable is considered, but our results can be expected to generalize to multivariate functions as well. Our approach allows us to extend previously existing results. The order of approximation is studied for functions belonging to suitable Lipschitz classes, using a moment-type approach. The special cases of neural network operators activated by logistic, hyperbolic tangent, and ramp sigmoidal functions are considered. In particular, we show that for C^1-functions, the order of approximation obtained here for our operators with logistic and hyperbolic tangent functions is higher than that established in some previous papers. The case of quasi-interpolation operators constructed with sigmoidal functions is also considered. (C) 2013 Elsevier Ltd. All rights reserved.
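The exact definition of the operators is given in the paper itself; as a hedged illustration only, a construction common in this literature builds a density phi_sigma(x) = [sigma(x+1) - sigma(x-1)] / 2 from a sigmoidal function sigma and forms the quasi-interpolation operator F_n(f, x) = sum_k f(k/n) phi_sigma(nx - k) / sum_k phi_sigma(nx - k), with k ranging over the integer grid of [a, b] scaled by n. The sketch below assumes this form with the logistic sigmoid; the function names are ours, not the paper's.

```python
import math

def logistic(x):
    # Logistic sigmoidal function sigma(x) = 1 / (1 + e^{-x})
    return 1.0 / (1.0 + math.exp(-x))

def phi(x, sigma):
    # Density built from a sigmoidal function (a common construction):
    # phi(x) = (sigma(x + 1) - sigma(x - 1)) / 2
    return 0.5 * (sigma(x + 1.0) - sigma(x - 1.0))

def nn_operator(f, x, n, a=-1.0, b=1.0, sigma=logistic):
    # Quasi-interpolation neural network operator on [a, b]:
    # F_n(f, x) = sum_k f(k/n) phi(nx - k) / sum_k phi(nx - k),
    # with k = ceil(n*a), ..., floor(n*b).
    ks = range(math.ceil(n * a), math.floor(n * b) + 1)
    num = sum(f(k / n) * phi(n * x - k, sigma) for k in ks)
    den = sum(phi(n * x - k, sigma) for k in ks)
    return num / den

# Example: approximating f(x) = x^2 on [-1, 1]; the uniform error on
# interior points shrinks as n grows.
f = lambda x: x * x
for n in (10, 40, 160):
    err = max(abs(nn_operator(f, x / 100.0, n) - f(x / 100.0))
              for x in range(-90, 91))
    print(n, err)
```

For the logistic sigmoid, phi is strictly positive, so the denominator never vanishes; the shrinking error with growing n is the pointwise/uniform convergence the abstract refers to.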
Pages: 101 - 106
Page count: 6
Related Papers
50 records in total
  • [21] A Neural Network Approximation Based on a Parametric Sigmoidal Function
    Yun, Beong In
    MATHEMATICS, 2019, 7 (03):
  • [22] Voronovskaja Type Theorems and High-Order Convergence Neural Network Operators with Sigmoidal Functions
    Costarelli, Danilo
    Vinti, Gianluca
    MEDITERRANEAN JOURNAL OF MATHEMATICS, 2020, 17 (03)
  • [24] Neural network Kantorovich operators activated by smooth ramp functions
    Agrawal, Purshottam N.
    Baxhaku, Behar
    MATHEMATICAL METHODS IN THE APPLIED SCIENCES, 2025, 48 (01) : 563 - 589
  • [25] Neural network interpolation operators activated by smooth ramp functions
    Qian, Yunyou
    Yu, Dansheng
    ANALYSIS AND APPLICATIONS, 2022, 20 (04) : 791 - 813
  • [26] Interpolation for Neural Network Operators Activated by Smooth Ramp Functions
    Baxhaku, Fesal
    Berisha, Artan
    Baxhaku, Behar
    COMPUTATION, 2024, 12 (07)
  • [27] Multivariate neural network operators activated by smooth ramp functions
    Baxhaku, Fesal
    Berisha, Artan
    Agrawal, Purshottam Narain
    Baxhaku, Behar
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 269
  • [28] A sigmoidal radial basis function neural network for function approximation
    Tsai, JR
    Chung, PC
    Chang, CI
    ICNN - 1996 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS. 1-4, 1996, : 496 - 501
  • [29] Constructive Approximation by Superposition of Sigmoidal Functions
    Danilo Costarelli
    Renato Spigler
    Analysis in Theory and Applications, 2013, 29 (02) : 169 - 196
  • [30] On the approximation capability of neural networks using bell-shaped and sigmoidal functions
    Ciuca, I
    1998 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS, VOLS 1-5, 1998, : 1845 - 1850