A Non-Sigmoidal Activation Function for Feedforward Artificial Neural Networks

Cited by: 0
Authors
Chandra, Pravin [1 ]
Ghose, Udayan [1 ]
Sood, Apoorvi [1 ]
Affiliations
[1] Guru Gobind Singh Indraprastha Univ, Univ Sch Informat & Commun Technol, Sect 16C, New Delhi 110078, India
Keywords
BACKPROPAGATION; APPROXIMATE
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
For a single-hidden-layer feedforward artificial neural network to possess the universal approximation property, it is sufficient that the activation function of the hidden-layer nodes be continuous and non-polynomial; the activation function need not be sigmoidal. In this paper a simple continuous, bounded, non-constant, differentiable, non-sigmoidal and non-polynomial function is proposed for use as the activation function at the hidden-layer nodes. The proposed activation function does not require the computation of an exponential function, and is therefore computationally less intensive than either the log-sigmoid or the hyperbolic tangent function. On a set of 10 function approximation tasks we demonstrate the efficiency and efficacy of the proposed activation function. The results show that, in an equal number of training epochs, networks using the proposed activation function reach deeper minima of the error functional, generalize better in most cases, and are statistically as good as, if not better than, networks using the logistic function as the activation function at the hidden nodes.
Pages: 8
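The abstract lists the properties of the proposed function but does not reproduce its formula. As an illustrative sketch only, the Python snippet below implements one candidate that satisfies every stated property: f(x) = x/(1 + x^2), which is continuous, bounded in [-1/2, 1/2], non-constant, differentiable, non-sigmoidal (both tails tend to 0, so it is non-monotone), non-polynomial, and exponential-free. This specific choice of f, and all function names, are assumptions for illustration, not the paper's definition.

```python
# Sketch of a candidate activation with the properties stated in the
# abstract. NOTE: f(x) = x / (1 + x^2) is an illustrative assumption;
# the paper's actual proposed function may differ.
import numpy as np

def candidate_activation(x):
    """Continuous, bounded in [-0.5, 0.5], differentiable,
    non-sigmoidal (both tails -> 0), non-polynomial, exponential-free."""
    return x / (1.0 + x * x)

def candidate_activation_grad(x):
    """Derivative (1 - x^2) / (1 + x^2)^2, as needed by backpropagation."""
    return (1.0 - x * x) / (1.0 + x * x) ** 2

def logistic(x):
    """Baseline log-sigmoid, which costs one exponential per evaluation."""
    return 1.0 / (1.0 + np.exp(-x))

if __name__ == "__main__":
    xs = np.linspace(-5.0, 5.0, 11)
    print(np.round(candidate_activation(xs), 3))  # bounded, both tails -> 0
    print(np.round(logistic(xs), 3))              # monotone, tails -> 0 and 1
```

Note that this candidate f is odd, peaks at x = ±1 with values ±1/2, and is evaluated with only a multiply, an add, and a divide, which is the sense in which such a function can be cheaper than the exponential-based log-sigmoid or hyperbolic tangent.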
Related Papers (50 in total)
  • [41] A Constructive Algorithm with Adaptive Sigmoidal Function for Designing Single Hidden Layer Feedforward Neural Network
    Sharma, Sudhir Kumar
    Chandra, Pravin
    [J]. MEMS, NANO AND SMART SYSTEMS, PTS 1-6, 2012, 403-408 : 3867 - +
  • [42] Application of radial basis function and feedforward artificial neural networks to the Escherichia coli fermentation process
    Warnes, MR
    Glassey, J
    Montague, GA
    Kara, B
    [J]. NEUROCOMPUTING, 1998, 20 (1-3) : 67 - 82
  • [43] Piecewise Polynomial Activation Functions for Feedforward Neural Networks
    López-Rubio, Ezequiel
    Ortega-Zamorano, Francisco
    Domínguez, Enrique
    Muñoz-Pérez, José
    [J]. Neural Processing Letters, 2019, 50 : 121 - 147
  • [44] Software Effort and Function Points Estimation Models Based Radial Basis Function and Feedforward Artificial Neural Networks
    Sheta, Alaa
    Rine, David
    Kassaymeh, Sofian
    [J]. INTERNATIONAL JOURNAL OF NEXT-GENERATION COMPUTING, 2015, 6 (03) : 192 - 205
  • [45] A low-complexity fuzzy activation function for artificial neural networks
    Soria-Olivas, E
    Martín-Guerrero, JD
    Camps-Valls, G
    Serrano-López, AJ
    Calpe-Maravilla, J
    Gómez-Chova, L
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2003, 14 (06) : 1576 - 1579
  • [46] Fault tolerance of feedforward artificial neural networks - A framework of study
    Chandra, P
    Singh, Y
    [J]. PROCEEDINGS OF THE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS 2003, VOLS 1-4, 2003, : 489 - 494
  • [47] PROCESS MONITORING USING AUTOASSOCIATIVE, FEEDFORWARD ARTIFICIAL NEURAL NETWORKS
    SKITT, PJC
    JAVED, MA
    SANDERS, SA
    HIGGINSON, AM
    [J]. JOURNAL OF INTELLIGENT MANUFACTURING, 1993, 4 (01) : 79 - 94
  • [48] On the Computability of Primitive Recursive Functions by Feedforward Artificial Neural Networks
    Kulyukin, Vladimir A.
    [J]. MATHEMATICS, 2023, 11 (20)
  • [49] WEIGHT DECAY AND RESOLUTION EFFECTS IN FEEDFORWARD ARTIFICIAL NEURAL NETWORKS
    MUNDIE, DB
    MASSENGILL, LW
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 1991, 2 (01) : 168 - 170
  • [50] OPTIMIZATION OF THE HIDDEN UNIT FUNCTION IN FEEDFORWARD NEURAL NETWORKS
    FUJITA, O
    [J]. NEURAL NETWORKS, 1992, 5 (05) : 755 - 764