Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units

Cited by: 0
Authors
Shang, Wenling [1,4]
Sohn, Kihyuk [2]
Almeida, Diogo [3]
Lee, Honglak [1]
Affiliations
[1] University of Michigan, Ann Arbor, MI 48109, USA
[2] NEC Laboratories America, Irving, TX, USA
[3] Enlitic, San Francisco, CA, USA
[4] Oculus VR, Menlo Park, CA 94025, USA
Funding
National Science Foundation (USA)
Keywords
(none listed)
DOI
(not available)
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Recently, convolutional neural networks (CNNs) have been used as a powerful tool to solve many problems in machine learning and computer vision. In this paper, we aim to provide insight into the properties of convolutional neural networks, as well as a generic method to improve the performance of many CNN architectures. Specifically, we first examine existing CNN models and observe an intriguing property: the filters in the lower layers form pairs (i.e., filters with opposite phase). Inspired by this observation, we propose a novel, simple yet effective activation scheme called concatenated ReLU (CReLU) and theoretically analyze its reconstruction property in CNNs. We integrate CReLU into several state-of-the-art CNN architectures and demonstrate improved recognition performance on the CIFAR-10/100 and ImageNet datasets with fewer trainable parameters. Our results suggest that a better understanding of the properties of CNNs can lead to significant performance improvements from a simple modification.
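A short sketch may make the abstract's activation scheme concrete: CReLU keeps both the positive and the negative phase of a pre-activation by concatenating ReLU(x) with ReLU(-x) along the channel axis, doubling the channel count. The following is a minimal sketch assuming a PyTorch-style module; the class name and the (batch, channel, height, width) layout are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn

class CReLU(nn.Module):
    """Concatenated ReLU (illustrative sketch, not the paper's code):
    concatenates ReLU(x) and ReLU(-x) along the channel dimension,
    so an input with C channels produces 2*C output channels."""

    def __init__(self, dim: int = 1):
        super().__init__()
        self.dim = dim  # channel axis for (N, C, H, W) feature maps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # relu(cat([x, -x])) is equivalent to cat([relu(x), relu(-x)])
        return torch.relu(torch.cat([x, -x], dim=self.dim))

# Example: 16 input channels become 32 output channels.
x = torch.randn(8, 16, 32, 32)
y = CReLU()(x)
print(y.shape)  # torch.Size([8, 32, 32, 32])
```

Because the output carries twice as many channels, subsequent convolution layers can use fewer filters, which is consistent with the abstract's claim of improved accuracy with fewer trainable parameters.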
Pages: 9
Related Papers
50 items in total
  • [1] FReLU: Flexible Rectified Linear Units for Improving Convolutional Neural Networks
    Qiu, Suo; Xu, Xiangmin; Cai, Bolun
    2018 24th International Conference on Pattern Recognition (ICPR), 2018: 1223-1228
  • [2] Rectified Exponential Units for Convolutional Neural Networks
    Ying, Yao; Su, Jianlin; Shan, Peng; Miao, Ligang; Wang, Xiaolian; Peng, Silong
    IEEE Access, 2019, 7: 101633-101640
  • [3] Understanding Weight Normalized Deep Neural Networks with Rectified Linear Units
    Xu, Yixi; Wang, Xiao
    Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018, 31
  • [4] Exploring Normalization in Deep Residual Networks with Concatenated Rectified Linear Units
    Shang, Wenling; Chiu, Justin; Sohn, Kihyuk
    Thirty-First AAAI Conference on Artificial Intelligence, 2017: 1509-1516
  • [5] Improving Deep Neural Networks for LVCSR Using Rectified Linear Units and Dropout
    Dahl, George E.; Sainath, Tara N.; Hinton, Geoffrey E.
    2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2013: 8609-8613
  • [6] Spam Filtering Using Regularized Neural Networks with Rectified Linear Units
    Barushka, Aliaksandr; Hajek, Petr
    AI*IA 2016: Advances in Artificial Intelligence, 2016, 10037: 65-75
  • [7] Deep Neural Networks with Elastic Rectified Linear Units for Object Recognition
    Jiang, Xiaoheng; Pang, Yanwei; Li, Xuelong; Pan, Jing; Xie, Yinghong
    Neurocomputing, 2018, 275: 1132-1139
  • [8] Hyperbolic Linear Units for Deep Convolutional Neural Networks
    Li, Jia; Xu, Hua; Deng, Junhui; Sun, Xiaomin
    2016 International Joint Conference on Neural Networks (IJCNN), 2016: 353-359
  • [9] Elastic Exponential Linear Units for Convolutional Neural Networks
    Kim, Daeho; Kim, Jinah; Kim, Jaeil
    Neurocomputing, 2020, 406: 253-266
  • [10] Improving Deep Convolutional Neural Networks with Mixed Maxout Units
    Zhao, Hui-zhen; Liu, Fu-xian; Li, Long-yue
    PLOS ONE, 2017, 12(7)