Comparative analysis of activation functions in neural networks

Cited by: 4
Authors
Kamalov, Firuz [1 ]
Nazir, Amril [2 ]
Safaraliev, Murodbek [3 ]
Cherukuri, Aswani Kumar [4 ]
Zgheib, Rita [5 ]
Affiliations
[1] Canadian Univ Dubai, Dept Elect Engn, Dubai, U Arab Emirates
[2] Zayed Univ, Dept Informat Syst, Abu Dhabi, U Arab Emirates
[3] Ural Fed Univ, Automated Elect Syst Dept, Ekaterinburg, Russia
[4] Vellore Inst Technol, Sch IT & Engn, Vellore, Tamil Nadu, India
[5] Canadian Univ Dubai, Dept Comp Sci, Dubai, U Arab Emirates
Keywords
activation function; neural networks; ReLU; sigmoid; loss function
DOI
10.1109/ICECS53924.2021.9665646
Chinese Library Classification
TM [Electrical engineering]; TN [Electronics and communication technology]
Discipline codes
0808; 0809
Abstract
Although the impact of activation functions on the accuracy of neural networks has been covered in the literature, there is little discussion of the relationship between activations and the geometry of the neural network model. In this paper, we examine the effects of various activation functions on the geometry of the model within the feature space. In particular, we investigate the relationship between the activations in the hidden and output layers, the geometry of the trained model, and its performance. We present visualizations of the trained models to help researchers better understand and intuit the effects of activation functions.
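The experiment the abstract describes lends itself to a small reproduction. The sketch below is not the authors' code; the toy dataset, network size, and the use of scikit-learn are illustrative assumptions. It trains identical MLPs that differ only in their hidden-layer activation and plots the resulting decision regions, making the geometric differences visible:

```python
# Minimal sketch (assumptions: make_moons toy data, a 16x16 MLP,
# scikit-learn's MLPClassifier) comparing how the choice of activation
# shapes the trained model's decision geometry in a 2D feature space.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_moons
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=500, noise=0.2, random_state=0)

# Grid over the feature space for plotting decision regions.
xx, yy = np.meshgrid(
    np.linspace(X[:, 0].min() - 1, X[:, 0].max() + 1, 300),
    np.linspace(X[:, 1].min() - 1, X[:, 1].max() + 1, 300),
)

activations = ["relu", "logistic", "tanh"]  # scikit-learn names; 'logistic' = sigmoid
fig, axes = plt.subplots(1, len(activations), figsize=(12, 4))
for ax, act in zip(axes, activations):
    # Same architecture and seed; only the activation differs.
    clf = MLPClassifier(hidden_layer_sizes=(16, 16), activation=act,
                        max_iter=2000, random_state=0).fit(X, y)
    Z = clf.predict(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)
    ax.contourf(xx, yy, Z, alpha=0.3)
    ax.scatter(X[:, 0], X[:, 1], c=y, s=10)
    ax.set_title(f"activation = {act}  (train acc = {clf.score(X, y):.2f})")
plt.tight_layout()
plt.show()
```

With ReLU the learned decision boundary is piecewise linear, while logistic (sigmoid) and tanh produce smooth boundaries; differences of this kind in the trained model's geometry are what the paper's visualizations examine.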
Pages: 6
Related papers
50 in total
  • [41] Learning Activation Functions in Deep (Spline) Neural Networks
    Bohra, Pakshal
    Campos, Joaquim
    Gupta, Harshit
    Aziznejad, Shayan
    Unser, Michael
    [J]. IEEE OPEN JOURNAL OF SIGNAL PROCESSING, 2020, 1 : 295 - 309
  • [42] Novel neuronal activation functions for feedforward neural networks
    Efe, Mehmet Önder
    [J]. NEURAL PROCESSING LETTERS, 2008, 28 (02) : 63 - 79
  • [43] Piecewise Polynomial Activation Functions for Feedforward Neural Networks
    López-Rubio, Ezequiel
    Ortega-Zamorano, Francisco
    Domínguez, Enrique
    Muñoz-Pérez, José
    [J]. NEURAL PROCESSING LETTERS, 2019, 50 : 121 - 147
  • [44] Improving the Performance of Neural Networks with an Ensemble of Activation Functions
    Nandi, Arijit
    Jana, Nanda Dulal
    Das, Swagatam
    [J]. 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [46] An analysis of weight initialization methods in connection with different activation functions for feedforward neural networks
    Wong, Kit
    Dornberger, Rolf
    Hanne, Thomas
    [J]. EVOLUTIONARY INTELLIGENCE, 2024, 17 (03) : 2081 - 2089
  • [47] Stability Analysis of Quaternion-Valued Neutral Neural Networks with Generalized Activation Functions
    Wu, Yanqiu
    Tu, Zhengwen
    Dai, Nina
    Wang, Liangwei
    Hu, Ning
    Peng, Tao
    [J]. COGNITIVE COMPUTATION, 2024, 16 (01) : 392 - 403
  • [48] Stability analysis of fractional-order Hopfield neural networks with discontinuous activation functions
    Zhang, Shuo
    Yu, Yongguang
    Wang, Qing
    [J]. NEUROCOMPUTING, 2016, 171 : 1075 - 1084
  • [49] Stability Analysis on Neural Networks with A Class of Mexican-Hat-type Activation Functions
    Wang, Lili
    [J]. 2014 INTERNATIONAL CONFERENCE ON INFORMATION SCIENCE, ELECTRONICS AND ELECTRICAL ENGINEERING (ISEEE), VOLS 1-3, 2014, : 1150 - 1154
  • [50] Stability analysis for the generalized Hopfield neural networks with multi-level activation functions
    Liu, Yiguang
    You, Zhisheng
    [J]. NEUROCOMPUTING, 2008, 71 (16-18) : 3595 - 3601