Comparative analysis of activation functions in neural networks

Cited by: 4
Authors
Kamalov, Firuz [1 ]
Nazir, Amril [2 ]
Safaraliev, Murodbek [3 ]
Cherukuri, Aswani Kumar [4 ]
Zgheib, Rita [5 ]
Affiliations
[1] Canadian Univ Dubai, Dept Elect Engn, Dubai, U Arab Emirates
[2] Zayed Univ, Dept Informat Syst, Abu Dhabi, U Arab Emirates
[3] Ural Fed Univ, Automated Elect Syst Dept, Ekaterinburg, Russia
[4] Vellore Inst Technol, Sch IT & Engn, Vellore, Tamil Nadu, India
[5] Canadian Univ Dubai, Dept Comp Sci, Dubai, U Arab Emirates
Keywords
activation function; neural networks; ReLU; sigmoid; loss function
DOI
10.1109/ICECS53924.2021.9665646
CLC Classification
TM [Electrical Engineering]; TN [Electronics & Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
Although the impact of activation functions on the accuracy of neural networks has been covered in the literature, there is little discussion of the relationship between the activations and the geometry of the neural network model. In this paper, we examine the effects of various activation functions on the geometry of the model within the feature space. In particular, we investigate the relationship between the activations in the hidden and output layers, the geometry of the trained neural network model, and the model performance. We present visualizations of the trained models to help researchers better understand and intuit the effects of activation functions.
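As background for the abstract, a minimal sketch of the two activations named in the keywords (ReLU and sigmoid) illustrates why they shape a model's feature-space geometry so differently; this example is illustrative and not taken from the paper itself.

```python
import numpy as np

def sigmoid(x):
    """Smooth, bounded activation: maps any input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    """Piecewise-linear activation: zero for negatives, identity otherwise."""
    return np.maximum(0.0, x)

# The two activations bend the feature space very differently:
# sigmoid saturates for large |x| (curved, bounded decision surfaces),
# while ReLU passes positive inputs unchanged and zeroes out negatives
# (piecewise-linear regions).
x = np.array([-2.0, 0.0, 2.0])
print(relu(x))       # [0. 0. 2.]
print(sigmoid(0.0))  # 0.5
```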
Pages: 6