Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units

Cited by: 0
Authors
Shang, Wenling [1 ,4 ]
Sohn, Kihyuk [2 ]
Almeida, Diogo [3 ]
Lee, Honglak [1 ]
Affiliations
[1] Univ Michigan, Ann Arbor, MI 48109 USA
[2] NEC Labs Amer, Irving, TX USA
[3] Enlitic, San Francisco, CA USA
[4] Oculus VR, Menlo Pk, CA 94025 USA
Funding
National Science Foundation (NSF);
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Theory of artificial intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Recently, convolutional neural networks (CNNs) have been used as a powerful tool to solve many problems of machine learning and computer vision. In this paper, we aim to provide insight into the properties of convolutional neural networks, as well as a generic method to improve the performance of many CNN architectures. Specifically, we first examine existing CNN models and observe an intriguing property that the filters in the lower layers form pairs (i.e., filters with opposite phase). Inspired by our observation, we propose a novel, simple yet effective activation scheme called concatenated ReLU (CReLU) and theoretically analyze its reconstruction property in CNNs. We integrate CReLU into several state-of-the-art CNN architectures and demonstrate improvement in their recognition performance on CIFAR-10/100 and ImageNet datasets with fewer trainable parameters. Our results suggest that a better understanding of the properties of CNNs can lead to significant performance improvement with a simple modification.
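The CReLU scheme described in the abstract concatenates the ReLU responses of the positive and negated pre-activations, so both phases of a filter's response survive the nonlinearity. The following is a minimal illustrative NumPy sketch of that definition, not the authors' implementation:

```python
import numpy as np

def crelu(x, axis=-1):
    """Concatenated ReLU: [ReLU(x), ReLU(-x)] stacked along the channel axis.

    Doubles the size of `axis` (typically the channel dimension),
    which is why CReLU networks can use half as many learned filters.
    """
    return np.concatenate([np.maximum(x, 0.0), np.maximum(-x, 0.0)], axis=axis)

# Example: two samples with two channels each.
x = np.array([[-1.0, 2.0],
              [ 3.0, -4.0]])
y = crelu(x)
print(y.shape)  # channel dimension doubles: (2, 4)
print(y)        # [[0. 2. 1. 0.]
                #  [3. 0. 0. 4.]]
```

Note that no information about the sign of the pre-activation is discarded: either the positive or the negative half carries each input value, which is the "reconstruction property" the paper analyzes.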
Pages: 9
Related papers
50 records in total
  • [31] Sound Event Localization and Detection Using Convolutional Recurrent Neural Networks and Gated Linear Units
    Komatsu, Tatsuya
    Togami, Masahito
    Takahashi, Tsubasa
    [J]. 28TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2020), 2021, : 41 - 45
  • [32] Graph-adaptive Rectified Linear Unit for Graph Neural Networks
    Zhang, Yifei
    Zhu, Hao
    Meng, Ziqiao
    Koniusz, Piotr
    King, Irwin
    [J]. PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022, : 1331 - 1339
  • [33] Memory Capacity of Neural Networks with Threshold and Rectified Linear Unit Activations
    Vershynin, Roman
    [J]. SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2020, 2 (04): : 1004 - 1033
  • [34] Breast Cancer Classification Using Concatenated Triple Convolutional Neural Networks Model
    Alshayeji, Mohammad H.
    Al-Buloushi, Jassim
    [J]. BIG DATA AND COGNITIVE COMPUTING, 2023, 7 (03)
  • [35] Facial Action Units for Training Convolutional Neural Networks
    Trinh Thi Doan Pham
    Won, Chee Sun
    [J]. IEEE ACCESS, 2019, 7 : 77816 - 77824
  • [36] Fruit category classification via an eight-layer convolutional neural network with parametric rectified linear unit and dropout technique
    Wang, Shui-Hua
    Chen, Yi
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2020, 79 (21-22) : 15117 - 15133
  • [37] Towards understanding residual and dilated dense neural networks via convolutional sparse coding
    Zhiyang Zhang
    Shihua Zhang
    [J]. National Science Review, 2021, 8 (03) : 127 - 139
  • [38] Linear Computation Coding for Convolutional Neural Networks
    Mueller, Ralf R.
    Rosenberger, Hans
    Reichenbach, Marc
    [J]. 2023 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP, SSP, 2023, : 562 - 565
  • [39] Towards understanding residual and dilated dense neural networks via convolutional sparse coding
    Zhang, Zhiyang
    Zhang, Shihua
    [J]. NATIONAL SCIENCE REVIEW, 2021, 8 (03)
  • [40] Fruit category classification via an eight-layer convolutional neural network with parametric rectified linear unit and dropout technique
    Shui-Hua Wang
    Yi Chen
    [J]. Multimedia Tools and Applications, 2020, 79 : 15117 - 15133