Deep delay rectified neural networks

Cited by: 0
Authors
Chuanhui Shan
Ao Li
Xiumei Chen
Affiliations
[1] Anhui Polytechnic University, School of Electrical Engineering
[2] Anhui Polytechnic University, School of Biological and Food Engineering
Source
The Journal of Supercomputing, 2023, 79(1): 880-896
Keywords
Excitation response threshold; Delay rectified linear unit (DRLU); DRLU neuron; DRLU network
DOI
Not available
Abstract
The activation function is one of the key factors behind the success of deep learning. According to neurobiology research, a biological neuron does not respond to an external stimulus at first; it fires only once the stimulus intensity reaches a certain value. However, the rectified linear unit (ReLU) family of activation functions, such as ReLU, LReLU, PReLU, ELU, SReLU, and MPELU, does not match this response characteristic of biological neurons. To address this problem, a delay rectified linear unit (DRLU) activation function with an excitation response threshold is proposed, based on the ReLU activation function. The DRLU activation function is more consistent with the response characteristics of biological neurons and more flexible than ReLU. Experimental results show that DRLU outperforms ReLU in accuracy, training time, and convergence on datasets such as MNIST, Fashion-MNIST, SVHN, CALTECH101, and FLOWER102. DRLU also offers a viewpoint and reference for introducing an excitation response threshold into LReLU, PReLU, ELU, SReLU, and MPELU.
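The abstract does not give DRLU's closed form, but its description (no response below the excitation response threshold, ReLU-like behavior above it) suggests a simple piecewise definition. The sketch below is a minimal illustration under that assumption, taking DRLU(x) = x for x >= theta and 0 otherwise; the formula and the parameter name theta are assumptions made here for illustration, not taken from the paper.

    import numpy as np

    def relu(x):
        """Standard rectified linear unit: passes positive inputs unchanged."""
        return np.maximum(0.0, x)

    def drlu(x, theta=0.5):
        """Delay rectified linear unit (sketch, not the paper's exact formula).

        Stays at 0 until the input reaches the excitation response
        threshold theta, mimicking a biological neuron that remains
        silent until the stimulus is strong enough.
        """
        return np.where(x >= theta, x, 0.0)

    x = np.linspace(-2.0, 2.0, 9)
    print("x:   ", x)
    print("ReLU:", relu(x))       # responds as soon as x > 0
    print("DRLU:", drlu(x, 0.5))  # silent until x >= theta

Under this assumption, theta = 0 recovers standard ReLU, which is consistent with the abstract's statement that DRLU is based on ReLU and is more flexible than it.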
Pages: 880-896
Page count: 16
Related Papers
50 records in total
  • [1] Understanding Weight Normalized Deep Neural Networks with Rectified Linear Units
    Xu, Yixi; Wang, Xiao
    Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018, 31
  • [2] Deep neural networks with Elastic Rectified Linear Units for object recognition
    Jiang, Xiaoheng; Pang, Yanwei; Li, Xuelong; Pan, Jing; Xie, Yinghong
    Neurocomputing, 2018, 275: 1132-1139
  • [3] Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks
    Zhang, Malu; Wang, Jiadong; Wu, Jibin; Belatreche, Ammar; Amornpaisannon, Burin; Zhang, Zhixuan; Miriyala, Venkata Pavan Kumar; Qu, Hong; Chua, Yansong; Carlson, Trevor E.; Li, Haizhou
    IEEE Transactions on Neural Networks and Learning Systems, 2022, 33(5): 1947-1958
  • [4] Improving Deep Neural Networks for LVCSR Using Rectified Linear Units and Dropout
    Dahl, George E.; Sainath, Tara N.; Hinton, Geoffrey E.
    2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2013: 8609-8613
  • [5] Solving Parametric Partial Differential Equations with Deep Rectified Quadratic Unit Neural Networks
    Lei, Zhen; Shi, Lei; Zeng, Chenyu
    Journal of Scientific Computing, 2022, 93(3)
  • [6] Self-gated rectified linear unit for performance improvement of deep neural networks
    Jahan, Israt; Ahmed, Md. Faisal; Ali, Md. Osman; Jang, Yeong Min
    ICT Express, 2023, 9(3): 320-325
  • [7] Rectified Exponential Units for Convolutional Neural Networks
    Ying, Yao; Su, Jianlin; Shan, Peng; Miao, Ligang; Wang, Xiaolian; Peng, Silong
    IEEE Access, 2019, 7: 101633-101640
  • [8] PowerNet: Efficient Representations of Polynomials and Smooth Functions by Deep Neural Networks with Rectified Power Units
    Li, Bo; Tang, Shanshan; Yu, Haijun
    Journal of Mathematical Study, 2020, 53(2): 159-191