Activation Functions and Their Characteristics in Deep Neural Networks

Citations: 0
Authors
Ding, Bin [1 ]
Qian, Huimin [1 ]
Zhou, Jun [1 ]
Affiliations
[1] Hohai Univ, Coll Energy & Elect Engn, Nanjing 211100, Peoples R China
Keywords
neural network; deep architecture; activation function
DOI: not available
CLC Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Deep neural networks have achieved remarkable results in many research areas, especially in computer vision and natural language processing. Their success depends on several factors, among which the development of activation functions is one of the most important. Accordingly, many studies have concentrated on the performance gains obtained by revising a particular activation function in a specific neural network. However, few papers thoroughly review the activation functions employed by neural networks. Therefore, given their impact on the performance of deep architectures, this paper surveys the status and development of commonly used activation functions. More specifically, it discusses their definitions, their effects on neural networks, and their respective advantages and disadvantages. Furthermore, experimental results on the MNIST dataset are used to compare the performance of different activation functions.
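The activation functions a survey like this typically covers (sigmoid, tanh, ReLU) can be sketched as follows. This is a generic NumPy illustration of the standard definitions, not code from the paper itself:

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid: squashes inputs into (0, 1); saturates for large |x|,
    # which is the classic cause of vanishing gradients in deep networks.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Hyperbolic tangent: zero-centered output in (-1, 1), but still saturates.
    return np.tanh(x)

def relu(x):
    # Rectified linear unit: identity for positive inputs, zero otherwise;
    # non-saturating for x > 0, which eases gradient flow in deep architectures.
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # values in (0, 1)
print(tanh(x))     # values in (-1, 1), zero-centered
print(relu(x))     # negative inputs clipped to 0
```

Comparing such functions empirically, as the paper does on MNIST, amounts to swapping the nonlinearity in an otherwise fixed network and measuring accuracy and convergence speed.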
Pages: 1836-1841 (6 pages)