Elastic Adaptively Parametric Compounded Units for Convolutional Neural Network

Cited: 0
Authors
Zhang, Changfan [1 ]
Xu, Yifu [1 ]
Sheng, Zhenwen [2 ]
Affiliations
[1] Hunan Univ Technol, 88 Taishan Xi Rd, Zhuzhou 412007, Hunan, Peoples R China
[2] Shandong Xiehe Univ, Coll Engn, 6277 Jiqing Rd, Jinan 250109, Shandong, Peoples R China
Keywords
activation function; deep learning; SENet; convolutional neural network; linear units
DOI
10.20965/jaciii.2023.p0576
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The activation function introduces nonlinearity into convolutional neural networks, which has greatly advanced computer vision tasks. This paper proposes elastic adaptively parametric compounded units to improve the performance of convolutional neural networks in image recognition. The proposed activation function adopts the structural advantages of two mainstream functions as its fundamental architecture. A SENet model is embedded in the activation function to adaptively recalibrate the feature-map weights in each channel, thereby enhancing the function's fitting capability. In addition, the function has an elastic slope in the positive input region, obtained by simulating random noise, to improve the generalization capability of neural networks. A special protection mechanism prevents the generated noise from producing overly large variations during training. To verify the effectiveness of the activation function, comparative experiments were conducted on the CIFAR-10 and CIFAR-100 image datasets under identical models. Experimental results show that the proposed activation function outperforms the other functions.
Pages: 576-584 (9 pages)
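The abstract names three ingredients: a compound base shape built from two mainstream activations, SENet-style channel recalibration of the activation's parameters, and a randomly perturbed ("elastic") positive slope with a safeguard against overly large variations. A minimal NumPy sketch of how such a unit might be wired together is given below. It assumes an ELU-like negative branch, Gaussian slope noise, and noise clipping as the protection mechanism; the names `eapcu` and `squeeze_excite` and all parameter choices are hypothetical illustrations, not the paper's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def squeeze_excite(x, w1, w2):
    """SENet-style per-channel gates for an input x of shape (C, H, W)."""
    z = x.mean(axis=(1, 2))               # squeeze: global average per channel, (C,)
    h = np.maximum(0.0, w1 @ z)           # excitation bottleneck + ReLU
    return 1.0 / (1.0 + np.exp(-(w2 @ h)))  # sigmoid gates in (0, 1), one per channel

def eapcu(x, w1, w2, training=False, k=0.1):
    """Hypothetical elastic adaptively parametric compounded unit (illustrative).

    Negative region: ELU-like curve, scaled per channel by SENet gates.
    Positive region: identity with a small random elastic slope during
    training; the noise is clipped to +/- k as a protection mechanism.
    """
    s = squeeze_excite(x, w1, w2)[:, None, None]   # (C, 1, 1) channel weights
    neg = s * (np.exp(np.minimum(x, 0.0)) - 1.0)   # adaptive negative branch
    eps = np.clip(rng.normal(0.0, k), -k, k) if training else 0.0
    pos = (1.0 + eps) * np.maximum(x, 0.0)         # elastic positive slope
    return np.where(x > 0, pos, neg)
```

At inference (`training=False`) the positive branch reduces to the identity, so the randomness only regularizes training, mirroring the generalization motivation in the abstract.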