Deep delay rectified neural networks

Cited by: 0
Authors
Chuanhui Shan
Ao Li
Xiumei Chen
Affiliations
[1] Anhui Polytechnic University, School of Electrical Engineering
[2] Anhui Polytechnic University, School of Biological and Food Engineering
Keywords
Excitation response threshold; Delay rectified linear unit (DRLU); DRLU neuron; DRLU network
DOI: not available
Abstract
The activation function is one of the key factors behind the success of deep learning. According to neurobiological research, a biological neuron does not respond to an external stimulus at first, and only responds once the stimulus intensity reaches a certain value. However, activation functions in the rectified linear unit (ReLU) family, such as ReLU, LReLU, PReLU, ELU, SReLU, and MPELU, do not reproduce this response characteristic of biological neurons. To address this problem, a delay rectified linear unit (DRLU) activation function with an excitation response threshold is proposed, based on the ReLU activation function. The DRLU activation function is more consistent with the response characteristics of biological neurons and more flexible than ReLU. Experimental results show that DRLU outperforms ReLU in accuracy, training time, and convergence on several datasets, including MNIST, Fashion-MNIST, SVHN, CALTECH101, and FLOWER102. The DRLU activation function also offers a viewpoint and reference for introducing an excitation response threshold into LReLU, PReLU, ELU, SReLU, and MPELU.
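As a rough illustration of the idea described in the abstract, the following is a minimal sketch of a threshold-gated ReLU, assuming the DRLU outputs zero until its input exceeds an excitation response threshold and behaves like ReLU above it. The function name `drlu`, the threshold value, and this exact formulation are illustrative assumptions, not the paper's definition.

```python
import numpy as np

def relu(x):
    """Standard ReLU: max(0, x)."""
    return np.maximum(x, 0.0)

def drlu(x, theta=0.05):
    """Sketch of a delay rectified linear unit.

    Assumed form: the unit stays silent (outputs 0) until the input
    exceeds the excitation response threshold `theta`, and passes the
    input through unchanged above the threshold. The paper's exact
    formulation may differ.
    """
    x = np.asarray(x, dtype=float)
    return np.where(x > theta, x, 0.0)

# Example: inputs below the threshold are suppressed, unlike with ReLU.
xs = np.array([-1.0, 0.0, 0.02, 0.1, 1.0])
print("ReLU:", relu(xs))   # [0.   0.   0.02 0.1  1.  ]
print("DRLU:", drlu(xs))   # [0.   0.   0.   0.1  1.  ]
```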
Pages: 880–896
Number of pages: 16