KAF + RSigELU: a nonlinear and kernel-based activation function for deep neural networks

Cited: 6
Authors
Kilicarslan, Serhat [1 ]
Celik, Mete [2 ]
Affiliations
[1] Bandirma Onyedi Eylul Univ, Software Engn Dept, Bandirma, Balikesir, Turkey
[2] Erciyes Univ, Dept Comp Engn, TR-38039 Kayseri, Turkey
Source
NEURAL COMPUTING & APPLICATIONS | 2022, Vol. 34, No. 16
Keywords
Kernel-based activation function (KAF); KAF + RSigELUS; KAF + RSigELUD; CNN; Deep neural network;
DOI
10.1007/s00521-022-07211-7
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Activation functions (AFs) are the basis of the neural network architectures used to accurately model and learn complex relationships between variables in real-world problems. They process the input information arriving at the network and produce the corresponding output. The kernel-based activation function (KAF) offers an extended version of the ReLU and sigmoid AFs. However, KAF faces the problems of bias shift originating from the negative region, vanishing gradients, limited adaptability and flexibility, and neuron death in its parameters during learning. In this study, the hybrid KAF + RSigELUS and KAF + RSigELUD AFs, which are extended versions of KAF, are proposed. The proposed AFs use the Gaussian kernel function and are effective in the positive, negative, and linear activation regions. Their performance was evaluated on the MNIST, Fashion MNIST, CIFAR-10, and SVHN benchmark datasets. The experimental results show that the proposed AFs overcome the existing problems and outperform the ReLU, LReLU, ELU, PReLU, and KAF AFs.
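To make the KAF idea in the abstract concrete, the following is a minimal NumPy sketch of the Gaussian-kernel formulation from Scardapane et al. (reference [6] below is the Kafnets paper): each neuron's activation is a learned kernel expansion f(s) = Σᵢ αᵢ exp(−γ (s − dᵢ)²), where the dictionary centres dᵢ and bandwidth γ are fixed and only the mixing coefficients αᵢ are trained. This is not the authors' exact KAF + RSigELU hybrid (whose piecewise RSigELU terms are defined in the paper itself); the function and variable names here are illustrative.

```python
import numpy as np

def kaf(s, alpha, d, gamma):
    """Kernel-based activation: f(s) = sum_i alpha_i * exp(-gamma * (s - d_i)^2).

    s     : array of pre-activations, shape (...,)
    alpha : learned mixing coefficients, shape (D,)
    d     : fixed dictionary of kernel centres, shape (D,)
    gamma : fixed Gaussian kernel bandwidth
    """
    # Broadcast (..., 1) - (D,) -> (..., D), then contract over the dictionary.
    return np.exp(-gamma * (s[..., None] - d) ** 2) @ alpha

# Dictionary of D = 20 centres uniformly spaced over [-3, 3].
d = np.linspace(-3.0, 3.0, 20)
# Bandwidth tied to the dictionary spacing (heuristic from the Kafnets paper).
gamma = 1.0 / (2.0 * (d[1] - d[0]) ** 2)
# In a real network alpha would be a trainable parameter per neuron;
# random initialisation here stands in for trained values.
alpha = np.random.default_rng(0).normal(scale=0.3, size=20)

y = kaf(np.array([-1.0, 0.0, 1.0]), alpha, d, gamma)
```

Because the αᵢ enter linearly, the gradient with respect to them is just the kernel matrix, which is what gives KAF-style activations their per-neuron flexibility compared with fixed shapes like ReLU.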
Pages: 13909-13923
Page count: 15