Deep delay rectified neural networks

Cited by: 0
Authors
Chuanhui Shan
Ao Li
Xiumei Chen
Affiliations
[1] Anhui Polytechnic University, School of Electrical Engineering
[2] Anhui Polytechnic University, School of Biological and Food Engineering
Keywords
Excitation response threshold; Delay rectified linear unit (DRLU); DRLU neuron; DRLU network;
DOI: not available
Abstract
The activation function is one of the key factors behind the success of deep learning. According to neurobiology research, biological neurons do not respond to an external stimulus in its initial stage and respond only once the stimulus intensity reaches a certain value. However, the rectified linear unit (ReLU) family of activation functions, such as ReLU, LReLU, PReLU, ELU, SReLU, and MPELU, does not match this response characteristic of biological neurons. To address this problem, a delay rectified linear unit (DRLU) activation function with an excitation response threshold is proposed, based on the ReLU activation function. The DRLU activation function is more consistent with the response characteristics of biological neurons and more flexible than ReLU. Experimental results show that DRLU outperforms ReLU in accuracy, training time, and convergence on different datasets, such as MNIST, Fashion-MNIST, SVHN, CALTECH101, and FLOWER102. The DRLU activation function also provides viewpoints and references for introducing an excitation response threshold into LReLU, PReLU, ELU, SReLU, and MPELU.
Pages: 880-896
Page count: 16
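
The abstract above says only that DRLU adds an excitation response threshold to ReLU and does not give the formula here, so the sketch below is an illustrative assumption rather than the paper's exact definition: the unit outputs zero until the input exceeds a threshold a and passes the input through unchanged above it. The function name drlu and the threshold value 0.05 are hypothetical.

    import numpy as np

    def relu(x):
        # Standard ReLU: zero for non-positive inputs, identity otherwise.
        return np.maximum(x, 0.0)

    def drlu(x, a=0.05):
        # Assumed DRLU behaviour: no response until the stimulus (input)
        # exceeds the excitation response threshold a, identity above it.
        # The exact formulation used in the paper may differ.
        return np.where(x > a, x, 0.0)

    x = np.array([-1.0, 0.0, 0.03, 0.1, 1.0])
    print("ReLU:", relu(x))   # [0.   0.   0.03 0.1  1.  ]
    print("DRLU:", drlu(x))   # [0.   0.   0.   0.1  1.  ]

Note how, under this assumed form, inputs below the threshold are suppressed entirely, mimicking a neuron that stays silent until the stimulus is strong enough, while larger inputs behave exactly as under ReLU.
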
Related papers
50 items in total
  • [11] PowerNet: Efficient Representations of Polynomials and Smooth Functions by Deep Neural Networks with Rectified Power Units
    Li, Bo
    Tang, Shanshan
    Yu, Haijun
    [J]. JOURNAL OF MATHEMATICAL STUDY, 2020, 53 (02) : 159 - 191
  • [12] Delay Compensated Asynchronous Adam Algorithm for Deep Neural Networks
    Shan, Lei
    Guan, Naiyang
    Yang, Canqun
    Xu, Weixia
    Zhang, Minxuan
    [J]. 2017 15TH IEEE INTERNATIONAL SYMPOSIUM ON PARALLEL AND DISTRIBUTED PROCESSING WITH APPLICATIONS AND 2017 16TH IEEE INTERNATIONAL CONFERENCE ON UBIQUITOUS COMPUTING AND COMMUNICATIONS (ISPA/IUCC 2017), 2017, : 852 - 859
  • [13] Coupled Nonlinear Delay Systems as Deep Convolutional Neural Networks
    Penkovsky, Bogdan
    Porte, Xavier
    Jacquot, Maxime
    Larger, Laurent
    Brunner, Daniel
    [J]. PHYSICAL REVIEW LETTERS, 2019, 123 (05)
  • [14] A proof that rectified deep neural networks overcome the curse of dimensionality in the numerical approximation of semilinear heat equations
    Hutzenthaler, Martin
    Jentzen, Arnulf
    Kruse, Thomas
    Nguyen, Tuan Anh
    [J]. PARTIAL DIFFERENTIAL EQUATIONS AND APPLICATIONS, 2020, 1 (02):
  • [15] Parametric rectified nonlinear unit (PRenu) for convolution neural networks
    Ilyas El Jaafari
    Ayoub Ellahyani
    Said Charfi
    [J]. Signal, Image and Video Processing, 2021, 15 : 241 - 246
  • [16] Parametric rectified nonlinear unit (PRenu) for convolution neural networks
    El Jaafari, Ilyas
    Ellahyani, Ayoub
    Charfi, Said
    [J]. SIGNAL IMAGE AND VIDEO PROCESSING, 2021, 15 (02) : 241 - 246
  • [17] Learning Two Layer Rectified Neural Networks in Polynomial Time
    Bakshi, Ainesh
    Jayaram, Rajesh
    Woodruff, David P.
    [J]. CONFERENCE ON LEARNING THEORY, VOL 99, 2019, 99
  • [18] Facial Expressions Recognition for Human-Robot Interaction Using Deep Convolutional Neural Networks with Rectified Adam Optimizer
    Melinte, Daniel Octavian
    Vladareanu, Luige
    [J]. SENSORS, 2020, 20 (08)
  • [19] Rectified Linear Neural Networks with Tied-Scalar Regularization for LVCSR
    Zhang, Shiliang
    Jiang, Hui
    Wei, Si
    Dai, Li-Rong
    [J]. 16TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2015), VOLS 1-5, 2015, : 2635 - 2639
  • [20] Rectified-linear and Recurrent Neural Networks Built with Spin Devices
    Dong, Qing
    Yang, Kaiyuan
    Fick, Laura
    Blaauw, David
    Sylvester, Dennis
    [J]. 2017 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS), 2017, : 2492 - 2495