Rectified Exponential Units for Convolutional Neural Networks

Cited by: 21
Authors
Ying, Yao [1 ]
Su, Jianlin [2 ]
Shan, Peng [1 ]
Miao, Ligang [3 ]
Wang, Xiaolian [4 ]
Peng, Silong [1 ,4 ]
Affiliations
[1] Northeastern Univ, Coll Informat Sci & Engn, Shenyang 110819, Liaoning, Peoples R China
[2] Sun Yat Sen Univ, Sch Math, Guangzhou 510220, Guangdong, Peoples R China
[3] Northeastern Univ, Sch Comp & Commun Engn, Shenyang 110819, Liaoning, Peoples R China
[4] Chinese Acad Sci, Inst Automat, Beijing 100190, Peoples R China
Source
IEEE ACCESS | 2019, Vol. 7
Funding
National Natural Science Foundation of China;
Keywords
Activation function; convolutional neural network; rectified exponential unit; parametric rectified exponential unit;
DOI
10.1109/ACCESS.2019.2928442
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Rectified linear unit (ReLU) plays an important role in today's convolutional neural networks (CNNs). In this paper, we propose a novel activation function called the Rectified Exponential Unit (REU). Inspired by two recently proposed activation functions, the Exponential Linear Unit (ELU) and Swish, the REU is designed to combine the advantages of a flexible exponent and a multiplicative function form. Moreover, we propose the Parametric REU (PREU) to increase the expressive power of the REU. Experiments with three classical CNN architectures (LeNet-5, Network in Network, and Residual Network (ResNet)) on benchmarks of various scales, including Fashion-MNIST, CIFAR-10, CIFAR-100, and Tiny ImageNet, demonstrate that the REU and PREU improve on other activation functions. With ResNet, the REU achieves relative error improvements over ReLU of 7.74% and 6.08% on CIFAR-10 and CIFAR-100, while the PREU achieves improvements of 9.24% and 9.32%, respectively. Finally, we use different PREU variants in the residual unit to obtain more stable results.
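As a rough illustration of the piecewise shape the abstract describes (an ELU-like exponential on the negative side combined with Swish's multiplicative x·f(x) form), here is a minimal PyTorch sketch. The exact REU and PREU formulas are given in the paper itself; the forms below, including the single learnable scale `alpha` on the negative branch, are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class REU(nn.Module):
    """Sketch of a Rectified Exponential Unit: identity for positive
    inputs, x * exp(x) otherwise (assumed form, not the paper's exact one)."""
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.where(x > 0, x, x * torch.exp(x))

class PREU(nn.Module):
    """Sketch of a Parametric REU: adds a learnable scale `alpha` on the
    negative branch (hypothetical parameterization)."""
    def __init__(self, alpha: float = 1.0):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(alpha))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.where(x > 0, x, self.alpha * x * torch.exp(x))
```

The quoted "relative error improvement" is the standard ratio (err_ReLU − err_new) / err_ReLU; for instance, with hypothetical numbers, dropping from 7.75% to 7.15% test error is a relative improvement of 0.60 / 7.75 ≈ 7.7%.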
Pages: 101633-101640
Page count: 8
Related Papers
50 in total
  • [31] Convolutional neural networks
    Derry, Alexander
    Krzywinski, Martin
    Altman, Naomi
    [J]. NATURE METHODS, 2023, 20 (09) : 1269 - 1270
  • [32] Dual Rectified Linear Units (DReLUs): A replacement for tanh activation functions in Quasi-Recurrent Neural Networks
    Godin, Frederic
    Degrave, Jonas
    Dambre, Joni
De Neve, Wesley
    [J]. PATTERN RECOGNITION LETTERS, 2018, 116 : 8 - 14
  • [34] Predictive Controller Based on Feedforward Neural Network with Rectified Linear Units
    Dolezel, Petr
    Honc, Daniel
    Stursa, Dominik
    [J]. INTELLIGENT SYSTEMS APPLICATIONS IN SOFTWARE ENGINEERING, VOL 1, 2019, 1046 : 1 - 12
  • [35] DIRECTION FINDING USING CONVOLUTIONAL NEURAL NETWORKS AND CONVOLUTIONAL RECURRENT NEURAL NETWORKS
    Uckun, Fehmi Ayberk
    Ozer, Hakan
    Nurbas, Ekin
    Onat, Emrah
    [J]. 2020 28TH SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2020,
  • [36] Facial Expressions Recognition for Human-Robot Interaction Using Deep Convolutional Neural Networks with Rectified Adam Optimizer
    Melinte, Daniel Octavian
    Vladareanu, Luige
    [J]. SENSORS, 2020, 20 (08)
  • [37] Application of Bit-Serial Arithmetic Units for FPGA Implementation of Convolutional Neural Networks
    Csordas, G.
    Feher, B.
    Kovacshazy, T.
    [J]. 2018 19TH INTERNATIONAL CARPATHIAN CONTROL CONFERENCE (ICCC), 2018, : 322 - 327
  • [38] CNNG: A Convolutional Neural Networks With Gated Recurrent Units for Autism Spectrum Disorder Classification
    Jiang, Wenjing
    Liu, Shuaiqi
    Zhang, Hong
    Sun, Xiuming
    Wang, Shui-Hua
    Zhao, Jie
    Yan, Jingwen
    [J]. FRONTIERS IN AGING NEUROSCIENCE, 2022, 14
  • [39] The Impact of Soft Errors in Memory Units of Edge Devices Executing Convolutional Neural Networks
    Abich, Geancarlo
    Garibotti, Rafael
    Reis, Ricardo
    Ost, Luciano
    [J]. IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2022, 69 (03) : 679 - 683
  • [40] Optimizing Convolutional Neural Networks for Image Classification on Resource-Constrained Microcontroller Units
    Brockmann, Susanne
    Schlippe, Tim
    [J]. COMPUTERS, 2024, 13 (07)