RMAF: Relu-Memristor-Like Activation Function for Deep Learning

Cited by: 51
Authors
Yu, Yongbin [1 ]
Adu, Kwabena [1 ]
Tashi, Nyima [2 ]
Anokye, Patrick [1 ]
Wang, Xiangxiang [1 ]
Ayidzoe, Mighty Abra [1 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Informat & Software Engn, Chengdu 610054, Peoples R China
[2] Tibet Univ, Sch Informat Sci & Technol, Lhasa 850000, Peoples R China
Source
IEEE ACCESS | 2020 / Vol. 8
Funding
National Natural Science Foundation of China
Keywords
Biological neural networks; Training; Neurons; Machine learning; Task analysis; Optimization; Activation function; deep learning; memristive window function; multi-layer perceptron; RMAF; NETWORKS; UNITS
DOI
10.1109/ACCESS.2020.2987829
CLC Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Activation functions facilitate deep neural networks by introducing non-linearity into the learning process. This non-linearity gives the neural network the ability to learn complex patterns. Currently, the most widely used activation function is the Rectified Linear Unit (ReLU). Although various other activation functions, including hand-designed alternatives to ReLU, have been proposed, none has succeeded in replacing ReLU because of their inconsistencies. In this work, an activation function called the ReLU-Memristor-like Activation Function (RMAF) is proposed to leverage the benefits of negative values in neural networks. RMAF introduces a constant parameter and a threshold parameter, making the function smooth and non-monotonic while introducing non-linearity into the network. Our experiments show that RMAF works better than ReLU and other activation functions on deeper models and across a number of challenging datasets. First, experiments are performed by training and classifying with a multi-layer perceptron (MLP) on benchmark data such as the Wisconsin breast cancer, MNIST, Iris and Car Evaluation datasets, on which RMAF achieves accuracies of 98.74%, 99.67%, 98.81% and 99.42% respectively, outperforming Sigmoid, Tanh and ReLU. Second, experiments were performed with a convolutional neural network (ResNet) on the MNIST, CIFAR-10 and CIFAR-100 datasets, where the proposed activation function achieves higher accuracies of 99.73%, 98.77% and 79.82% respectively than Tanh, ReLU and Swish. Additionally, we experimented with deeper networks, i.e., the squeeze network (SqueezeNet) and the densely connected network (DenseNet121), and with the ImageNet dataset, on which RMAF produced the best performance. We note that RMAF converges faster than the other functions and can replace ReLU in any neural network owing to its efficiency, scalability and similarity to both ReLU and Swish.
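The abstract does not reproduce the RMAF formula itself, so the snippet below is only a minimal illustrative sketch of the kind of activation it describes: a smooth, non-monotonic, ReLU/Swish-like function shaped by a constant parameter and a threshold parameter, packaged as a drop-in replacement for ReLU. The class name SmoothReLULike, the sigmoid-gated form and the default parameter values are assumptions for illustration, not the definition from the paper.

import torch
import torch.nn as nn

class SmoothReLULike(nn.Module):
    """Illustrative smooth, non-monotonic activation (NOT the exact RMAF).

    Mirrors the ingredients the abstract mentions: a constant scale
    parameter and a threshold parameter shaping a smooth, Swish-like gate.
    """

    def __init__(self, alpha: float = 1.0, beta: float = 0.25):
        super().__init__()
        self.alpha = alpha  # constant parameter (assumed placeholder value)
        self.beta = beta    # threshold parameter (assumed placeholder value)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x * sigmoid(alpha * (x - beta)) is smooth, non-monotonic, and
        # passes small negative activations instead of zeroing them as ReLU does.
        return x * torch.sigmoid(self.alpha * (x - self.beta))

# Drop-in replacement for ReLU in a small MLP, in the spirit of the paper's first experiments.
mlp = nn.Sequential(
    nn.Linear(30, 64),   # 30 input features, e.g. the Wisconsin breast cancer data
    SmoothReLULike(),
    nn.Linear(64, 2),
)
print(mlp(torch.randn(4, 30)).shape)  # torch.Size([4, 2])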
Pages: 72727 - 72741
Page count: 15
Related Papers
50 records in total
  • [1] Learning algorithm analysis for deep neural network with ReLu activation functions
    Placzek, Stanislaw
    Placzek, Aleksander
    [J]. COMPUTER APPLICATIONS IN ELECTRICAL ENGINEERING (ZKWE'2018), 2018, 19
  • [2] Deep Learning-Based Beamforming for Millimeter-Wave Systems Using Parametric ReLU Activation Function
    Alshimaa H. Ismail
    Tarek Abed Soliman
    Mohamed Rihan
    Moawad I. Dessouky
    [J]. Wireless Personal Communications, 2023, 129 : 825 - 836
  • [3] Deep Learning-Based Beamforming for Millimeter-Wave Systems Using Parametric ReLU Activation Function
    Ismail, Alshimaa H.
    Soliman, Tarek Abed
    Rihan, Mohamed
    Dessouky, Moawad I.
    [J]. WIRELESS PERSONAL COMMUNICATIONS, 2023, 129 (02) : 825 - 836
  • [4] DISCUSSION OF: "NONPARAMETRIC REGRESSION USING DEEP NEURAL NETWORKS WITH RELU ACTIVATION FUNCTION"
    Ghorbani, Behrooz
    Mei, Song
    Misiakiewicz, Theodor
    Montanari, Andrea
    [J]. ANNALS OF STATISTICS, 2020, 48 (04): : 1898 - 1901
  • [5] REJOINDER: "NONPARAMETRIC REGRESSION USING DEEP NEURAL NETWORKS WITH RELU ACTIVATION FUNCTION"
    Schmidt-Hieber, Johannes
    [J]. ANNALS OF STATISTICS, 2020, 48 (04): : 1916 - 1921
  • [6] A comparison of deep networks with ReLU activation function and linear spline-type methods
    Eckle, Konstantin
    Schmidt-Hieber, Johannes
    [J]. NEURAL NETWORKS, 2019, 110 : 232 - 242
  • [7] A Kind of Extreme Learning Machine Based on Memristor Activation Function
    Li, Hanman
    Wang, Lidan
    Duan, ShuKai
    [J]. PROCEEDINGS OF ELM-2017, 2019, 10 : 210 - 218
  • [8] Optical ReLU-like activation function based on a semiconductor laser with optical injection
    Liu, Guan-ting
    Shen, Yi-wei
    Li, Rui-qian
    Yu, Jingyi
    He, Xuming
    Wang, Cheng
    [J]. OPTICS LETTERS, 2024, 49 (04) : 818 - 821
  • [9] An Empirical Study on Generalizations of the ReLU Activation Function
    Banerjee, Chaity
    Mukherjee, Tathagata
    Pasiliao, Eduardo, Jr.
    [J]. PROCEEDINGS OF THE 2019 ANNUAL ACM SOUTHEAST CONFERENCE (ACMSE 2019), 2019, : 164 - 167
  • [10] The Multi-phase ReLU Activation Function
    Banerjee, Chaity
    Mukherjee, Tathagata
    Pasiliao, Eduardo, Jr.
    [J]. ACMSE 2020: PROCEEDINGS OF THE 2020 ACM SOUTHEAST CONFERENCE, 2020, : 239 - 242