NIPUNA: A Novel Optimizer Activation Function for Deep Neural Networks

Cited by: 4
Authors
Madhu, Golla [1 ]
Kautish, Sandeep [2 ,3 ]
Alnowibet, Khalid Abdulaziz [4 ]
Zawbaa, Hossam M. [5]
Mohamed, Ali Wagdy [6 ,7 ]
Affiliations
[1] VNR Vignana Jyothi Inst Engn & Technol, Dept Informat Technol, Hyderabad 500090, Telangana, India
[2] LBEF Campus, Kathmandu 44600, Nepal
[3] Asia Pacific Univ Technol & Innovat, Kuala Lumpur, Malaysia
[4] King Saud Univ, Coll Sci, Stat & Operat Res Dept, POB 2455, Riyadh 11451, Saudi Arabia
[5] Technol Univ Dublin, CeADAR Irelands Ctr Appl AI, Dublin D7 EWV4, Ireland
[6] Cairo Univ, Fac Grad Studies Stat Res, Operat Res Dept, Giza 12613, Egypt
[7] Amer Univ Cairo, Sch Sci & Engn, Dept Math & Actuarial Sci, Cairo 11835, Egypt
Keywords
convolutional neural networks; deep neural networks; NIPUNA; periodic function
DOI
10.3390/axioms12030246
Chinese Library Classification (CLC)
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
In recent years, deep neural networks with different learning paradigms have been widely employed in applications including medical diagnosis, image analysis, and self-driving vehicles. The activation function used in a deep neural network has a major impact on both model training and model reliability. The Rectified Linear Unit (ReLU) has emerged as the most popular and extensively used activation function. ReLU has flaws, however: during back-propagation it is active only when its inputs are positive and outputs zero otherwise, which can cause neurons to die (the dying ReLU problem) and introduces a bias shift. Swish, unlike ReLU, is non-monotonic: it neither stays constant nor moves in a single direction. This research proposes a new activation function for deep neural networks, named NIPUNA. We test it by training customized convolutional neural networks (CCNNs) on benchmark datasets (Fashion-MNIST images of clothing and the MNIST dataset of handwritten digits), and we compare the results against various activation functions. The proposed activation function can outperform traditional activation functions.
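
To make the contrast in the abstract concrete, below is a minimal NumPy sketch of the standard ReLU and Swish definitions. This is an illustration only: it does not reproduce the paper's NIPUNA function (whose definition is given in the article at the DOI above), and the beta parameter is the usual Swish hyperparameter assumed here for demonstration, not a value taken from this record.

    # Sketch of standard ReLU and Swish; NOT the paper's NIPUNA formula.
    import numpy as np

    def relu(x):
        # ReLU passes positive inputs through unchanged, zero otherwise.
        return np.maximum(0.0, x)

    def relu_grad(x):
        # Gradient is 1 for x > 0 and 0 for x <= 0: units stuck in the
        # negative region receive no gradient (the "dying ReLU" problem).
        return (x > 0).astype(float)

    def swish(x, beta=1.0):
        # Swish(x) = x * sigmoid(beta * x); smooth and non-monotonic, so
        # small negative inputs still yield nonzero outputs and gradients.
        return x / (1.0 + np.exp(-beta * x))

    x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print("relu     :", relu(x))
    print("relu grad:", relu_grad(x))
    print("swish    :", swish(x))

Running the script shows that ReLU's gradient is exactly zero for every negative input, which is the mechanism behind the dying-ReLU problem the abstract mentions, while Swish still returns small nonzero values in that region.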
Pages: 13
Related papers
50 items in total (first 10 shown)
  • [1] A Novel Activation Function of Deep Neural Network
    Xiangyang, Lin
    Xing, Qinghua
    Han, Zhang
    Feng, Chen
    [J]. Scientific Programming, 2023, 2023
  • [2] Improved Adam Optimizer for Deep Neural Networks
    Zhang, Zijun
    [J]. 2018 IEEE/ACM 26TH INTERNATIONAL SYMPOSIUM ON QUALITY OF SERVICE (IWQOS), 2018
  • [3] RSigELU: A nonlinear activation function for deep neural networks
    Kilicarslan, Serhat
    Celik, Mete
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2021, 174
  • [4] On the Impact of the Activation Function on Deep Neural Networks Training
    Hayou, Soufiane
    Doucet, Arnaud
    Rousseau, Judith
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [5] A novel type of activation function in artificial neural networks: Trained activation function
    Ertugrul, Omer Faruk
    [J]. NEURAL NETWORKS, 2018, 99 : 148 - 157
  • [6] A Novel Posit-based Fast Approximation of ELU Activation Function for Deep Neural Networks
    Cococcioni, Marco
    Rossi, Federico
    Ruffaldi, Emanuele
    Saponara, Sergio
    [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON SMART COMPUTING (SMARTCOMP), 2020, : 244 - 246
  • [7] An Efficient Asymmetric Nonlinear Activation Function for Deep Neural Networks
    Chai, Enhui
    Yu, Wei
    Cui, Tianxiang
    Ren, Jianfeng
    Ding, Shusheng
    [J]. SYMMETRY-BASEL, 2022, 14 (05)
  • [8] Regularized Flexible Activation Function Combination for Deep Neural Networks
    Jie, Renlong
    Gao, Junbin
    Vasnev, Andrey
    Tran, Minh-ngoc
    [J]. 2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 2001 - 2008
  • [9] A Novel Learning Algorithm to Optimize Deep Neural Networks: Evolved Gradient Direction Optimizer (EVGO)
    Karabayir, Ibrahim
    Akbilgic, Oguz
    Tas, Nihat
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, 32 (02) : 685 - 694
  • [10] A Novel Hybrid Analog Design Optimizer with Particle Swarm Optimization and modern Deep Neural Networks
    Elsiginy, Ahmed
    Elmahdy, Mohamed
    Azab, Eman
    [J]. 2019 INTERNATIONAL SOC DESIGN CONFERENCE (ISOCC), 2019, : 212 - 213