FReLU: Flexible Rectified Linear Units for Improving Convolutional Neural Networks

Cited: 0
Authors
Qiu, Suo [1 ]
Xu, Xiangmin [1 ]
Cai, Bolun [1 ]
Affiliations
[1] South China Univ Technol, Sch Elect & Informat Engn, Wushan RD, Guangzhou, Guangdong, Peoples R China
DOI: none available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Subject Classification: 081104; 0812; 0835; 1405
Abstract
The rectified linear unit (ReLU) is a widely used activation function for deep convolutional neural networks. However, because of its hard rectification at zero, ReLU networks lose the benefits of negative values. In this paper, we propose a novel activation function called the flexible rectified linear unit (FReLU) to further explore the effect of negative values. By redesigning the rectified point of ReLU as a learnable parameter, FReLU expands the states of the activation output. When a network is successfully trained, FReLU tends to converge to a negative value, which improves expressiveness and thus performance. Furthermore, FReLU is designed to be simple and effective, avoiding exponential functions to keep computation low-cost. Because it adapts itself during training, FReLU does not rely on strict assumptions and can be used easily in various network architectures. We evaluate FReLU on three standard image classification datasets: CIFAR-10, CIFAR-100, and ImageNet. Experimental results show that FReLU achieves fast convergence and competitive performance on both plain and residual networks.
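To make the mechanism described in the abstract concrete, the following is a minimal PyTorch sketch of an activation whose rectified point is a learnable parameter. The per-channel parameterization, zero initialization, and the names `FReLU`, `num_channels`, and `bias` are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class FReLU(nn.Module):
    """Sketch of a flexible rectified linear unit: standard rectification
    plus a learnable per-channel shift, so the activation can emit negative
    values. Parameterization and initialization are assumptions."""

    def __init__(self, num_channels: int):
        super().__init__()
        # Learnable shift; the abstract notes this value tends to converge
        # to a negative number once the network is successfully trained.
        self.bias = nn.Parameter(torch.zeros(num_channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Broadcast the per-channel shift over (N, C, H, W) feature maps.
        return torch.relu(x) + self.bias.view(1, -1, 1, 1)

# Example usage on a batch of 64-channel feature maps.
act = FReLU(num_channels=64)
y = act(torch.randn(8, 64, 32, 32))
```

Note that, unlike exponential-based activations such as ELU, this formulation uses only a rectification and an addition, which is consistent with the abstract's claim of low-cost computation.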
Pages: 1223-1228 (6 pages)
Related Papers (50 total)
  • [1] Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units
    Shang, Wenling
    Sohn, Kihyuk
    Almeida, Diogo
    Lee, Honglak
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016, 48
  • [2] Rectified Exponential Units for Convolutional Neural Networks
    Ying, Yao
    Su, Jianlin
    Shan, Peng
    Miao, Ligang
    Wang, Xiaolian
    Peng, Silong
    [J]. IEEE ACCESS, 2019, 7 : 101633 - 101640
  • [3] IMPROVING DEEP NEURAL NETWORKS FOR LVCSR USING RECTIFIED LINEAR UNITS AND DROPOUT
    Dahl, George E.
    Sainath, Tara N.
    Hinton, Geoffrey E.
    [J]. 2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2013, : 8609 - 8613
  • [4] Understanding Weight Normalized Deep Neural Networks with Rectified Linear Units
    Xu, Yixi
    Wang, Xiao
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [5] Spam Filtering Using Regularized Neural Networks with Rectified Linear Units
    Barushka, Aliaksandr
    Hajek, Petr
    [J]. AI*IA 2016: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2016, 10037 : 65 - 75
  • [6] Deep neural networks with Elastic Rectified Linear Units for object recognition
    Jiang, Xiaoheng
    Pang, Yanwei
    Li, Xuelong
    Pan, Jing
    Xie, Yinghong
    [J]. NEUROCOMPUTING, 2018, 275 : 1132 - 1139
  • [7] Hyperbolic Linear Units for Deep Convolutional Neural Networks
    Li, Jia
    Xu, Hua
    Deng, Junhui
    Sun, Xiaomin
    [J]. 2016 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2016, : 353 - 359
  • [8] Elastic exponential linear units for convolutional neural networks
    Kim, Daeho
    Kim, Jinah
    Kim, Jaeil
    [J]. NEUROCOMPUTING, 2020, 406 : 253 - 266
  • [9] Improving deep convolutional neural networks with mixed maxout units
    Zhao, Hui-zhen
    Liu, Fu-xian
    Li, Long-yue
    [J]. PLOS ONE, 2017, 12 (07):
  • [10] LDConv: Linear deformable convolution for improving convolutional neural networks
    Zhang, Xin
    Song, Yingze
    Song, Tingting
    Yang, Degang
    Ye, Yichen
    Zhou, Jie
    Zhang, Liming
    [J]. IMAGE AND VISION COMPUTING, 2024, 149