Deep delay rectified neural networks

Cited: 0
Authors
Chuanhui Shan
Ao Li
Xiumei Chen
Affiliations
[1] Anhui Polytechnic University, School of Electrical Engineering
[2] Anhui Polytechnic University, School of Biological and Food Engineering
Keywords
Excitation response threshold; Delay rectified linear unit (DRLU); DRLU neuron; DRLU network
DOI: Not available
Abstract
An activation function is one of the key factors in the success of deep learning. According to neurobiology research, biological neurons do not respond to external stimuli in the initial stage; they begin to respond only once the stimulus intensity reaches a certain value. However, activation functions in the rectified linear unit (ReLU) family, such as ReLU, LReLU, PReLU, ELU, SReLU, and MPELU, do not reproduce this response characteristic of biological neurons. To address this problem, a delay rectified linear unit (DRLU) activation function with an excitation response threshold is proposed, based on the ReLU activation function. The DRLU activation function is more consistent with the response characteristics of biological neurons and more flexible than the ReLU activation function. The experimental results show that the DRLU activation function outperforms the ReLU activation function in accuracy, training time, and convergence on datasets such as MNIST, Fashion-MNIST, SVHN, CALTECH101, and FLOWER102. The DRLU activation function also provides viewpoints and references for introducing an excitation response threshold into LReLU, PReLU, ELU, SReLU, and MPELU.
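The abstract does not reproduce the DRLU formula. Its description, a ReLU variant that stays silent until the input exceeds an excitation response threshold, suggests a thresholded ReLU, so the following minimal NumPy sketch is one plausible reading rather than the paper's definition; the threshold parameter tau and the exact form (output x when x > tau, otherwise 0) are assumptions for illustration.

import numpy as np

def relu(x):
    # Standard ReLU: responds to any positive stimulus.
    return np.maximum(0.0, x)

def drlu(x, tau=0.5):
    # Hypothetical DRLU reading: the unit stays silent until the
    # input exceeds the excitation response threshold tau (assumed
    # form, not the paper's definition); tau = 0 recovers plain ReLU.
    return np.where(x > tau, x, 0.0)

x = np.linspace(-1.0, 2.0, 7)
print("ReLU:", relu(x))   # activates for every x > 0
print("DRLU:", drlu(x))   # activates only for x > 0.5

With tau = 0 the two functions coincide, which is consistent with the abstract's claim that DRLU is a more flexible variant of ReLU.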
Pages: 880-896
Number of pages: 16
Related papers
50 records in total
  • [31] LOS Delay Estimation using Super Resolution Deep Neural Networks for Precise Positioning
    Yerramalli, Srinivas
    Yoo, Taesang
    Ferrari, Lorenzo
    [J]. 2020 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2020,
  • [32] Group Delay based Music Source Separation using Deep Recurrent Neural Networks
    Sebastian, Jilt
    Murthy, Hema A.
    [J]. 2016 INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATIONS (SPCOM), 2016,
  • [33] Rectified deep neural networks overcome the curse of dimensionality when approximating solutions of McKean-Vlasov stochastic differential equations
    Neufeld, Ariel
    Nguyen, Tuan Anh
    [J]. JOURNAL OF MATHEMATICAL ANALYSIS AND APPLICATIONS, 2025, 541 (01)
  • [34] Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units
    Shang, Wenling
    Sohn, Kihyuk
    Almeida, Diogo
    Lee, Honglak
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016, 48
  • [35] Rectified Attention Gate Unit in Recurrent Neural Networks for Effective Attention Computation
    Ha, Manh-Hung
    Chen, Oscal Tzyh-Chiang
    [J]. 2023 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP, SSP, 2023, : 81 - 84
  • [36] ACTIVE NEURON LEAST SQUARES: A TRAINING METHOD FOR MULTIVARIATE RECTIFIED NEURAL NETWORKS
    Ainsworth, Mark
    Shin, Yeonjong
    [J]. SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2022, 44 (04): : A2253 - A2275
  • [37] Gradient rectified parameter unit of the fully connected layer in convolutional neural networks
    Zheng, Tianyou
    Wang, Qiang
    Shen, Yue
    Lin, Xiaotian
    [J]. KNOWLEDGE-BASED SYSTEMS, 2022, 248
  • [38] Stability of stochastic delay neural networks
    Blythe, S
    Mao, XR
    Liao, XX
    [J]. JOURNAL OF THE FRANKLIN INSTITUTE-ENGINEERING AND APPLIED MATHEMATICS, 2001, 338 (04): : 481 - 495
  • [39] Stability for Cellular Neural Networks with Delay
    Yang, Jinxiang
    Zhong, Shouming
    Yan, Keyu
    [J]. Journal of Electronic Science and Technology, 2005, (02) : 123 - 125
  • [40] Fast time delay neural networks
    El-Bakry, HM
    Zhao, QF
    [J]. INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2005, 15 (06) : 445 - 455