Activation function optimization scheme for image classification

Times cited: 0
Authors
Rahman, Abdur [1 ]
He, Lu [2 ]
Wang, Haifeng [1 ]
Affiliations
[1] Mississippi State Univ, Dept Ind & Syst Engn, Mississippi State, MS 39762 USA
[2] Mississippi State Univ, Dept Mkt Quantitat Anal & Business Law, Mississippi State, MS 39762 USA
Keywords
Activation function; Evolutionary approach; Exponential Error Linear Unit (EELU); Genetic algorithm; Neural network
DOI
10.1016/j.knosys.2024.112502
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The activation function has a significant impact on the dynamics, convergence, and performance of deep neural networks, and the search for a consistent, high-performing activation function has been a constant pursuit in deep learning model development. Existing state-of-the-art activation functions were designed manually with human expertise, with the exception of Swish, which was discovered through a reinforcement learning-based search strategy. In this study, we propose an evolutionary approach for optimizing activation functions specifically for image classification tasks, aiming to discover functions that outperform the current state-of-the-art options. Through this optimization framework, we obtain a series of high-performing activation functions, denoted Exponential Error Linear Units (EELU). The developed activation functions are evaluated for image classification from two perspectives: (1) five state-of-the-art neural network architectures (ResNet50, AlexNet, VGG16, MobileNet, and the Compact Convolutional Transformer), spanning computationally heavy to lightweight networks, and (2) eight standard datasets (CIFAR10, Imagenette, MNIST, Fashion MNIST, Beans, Colorectal Histology, CottonWeedID15, and TinyImageNet), ranging from typical machine vision benchmarks to agricultural and medical imaging applications. Finally, we statistically investigate the generalization of the activation functions produced by the optimization scheme. Based on a Friedman test, we conclude that the optimization scheme generates activation functions that outperform the existing standard ones in 92.8% of the 28 cases studied, and -x·erf(e^(-x)) is found to be the best activation function for image classification generated by the scheme.
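The best-performing function reported above, f(x) = -x·erf(e^(-x)), is straightforward to implement. Below is a minimal sketch, not the authors' released code, of how it could be wrapped as a drop-in PyTorch activation module; the class name EELU and the module interface are illustrative assumptions.

import torch
import torch.nn as nn


class EELU(nn.Module):
    """Exponential Error Linear Unit variant: f(x) = -x * erf(exp(-x))."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # torch.erf is the Gauss error function; exp(-x) is positive for all
        # real x, so erf(exp(-x)) lies strictly between 0 and 1.
        return -x * torch.erf(torch.exp(-x))


if __name__ == "__main__":
    # Sanity check: apply the activation to a few sample inputs.
    act = EELU()
    x = torch.linspace(-3.0, 3.0, steps=7)
    print(act(x))

In practice, such a module would be substituted wherever a standard activation (e.g., nn.ReLU) appears in the architectures named in the abstract, which is how activation-function comparisons of this kind are typically carried out.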
Pages: 12