A Non-Sigmoidal Activation Function for Feedforward Artificial Neural Networks

Cited by: 0
Authors
Chandra, Pravin [1]
Ghose, Udayan [1]
Sood, Apoorvi [1]
Affiliations
[1] Guru Gobind Singh Indraprastha Univ, Univ Sch Informat & Commun Technol, Sect 16C, New Delhi 110078, India
Keywords
BACKPROPAGATION; APPROXIMATE;
DOI
Not available
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
For a single hidden layer feedforward artificial neural network to possess the universal approximation property, it is sufficient that the hidden layer nodes' activation function be a continuous non-polynomial function; a sigmoidal activation function is not required. In this paper, a simple continuous, bounded, non-constant, differentiable, non-sigmoidal and non-polynomial function is proposed for use as the activation function at the hidden layer nodes. The proposed activation function does not require the computation of an exponential function, and is thus computationally less intensive than either the log-sigmoid or the hyperbolic tangent function. On a set of 10 function approximation tasks, the efficiency and efficacy of the proposed activation function are demonstrated. The results allow us to assert that, at least on these 10 tasks, networks trained for an equal number of epochs with the proposed activation function reach deeper minima of the error functional, generalize better in most cases, and are statistically as good as, if not better than, networks using the logistic function as the hidden-node activation function.
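The abstract's computational-cost claim can be illustrated with a minimal sketch. The paper's exact function is not reproduced in this record, so the rational function `f(x) = x / (1 + x**2)` below is a purely hypothetical stand-in: like the proposed function it is continuous, bounded, non-constant, differentiable, non-sigmoidal (it is non-monotonic) and non-polynomial, and it avoids the exponential that the log-sigmoid requires.

```python
import math

def non_exponential_activation(x: float) -> float:
    """Hypothetical illustration (NOT the paper's function):
    f(x) = x / (1 + x^2) is bounded in [-0.5, 0.5], smooth,
    non-monotonic (hence non-sigmoidal), non-polynomial,
    and costs only a multiply, an add, and a divide."""
    return x / (1.0 + x * x)

def log_sigmoid(x: float) -> float:
    """Standard logistic activation, which requires exp()."""
    return 1.0 / (1.0 + math.exp(-x))
```

Avoiding `exp()` is the source of the claimed per-node speedup: the rational form evaluates with a handful of elementary arithmetic operations, whereas each log-sigmoid or tanh evaluation involves a transcendental function call.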
Pages: 8