ChannelNets: Compact and Efficient Convolutional Neural Networks via Channel-Wise Convolutions

Cited by: 36
|
Authors
Gao, Hongyang [1 ]
Wang, Zhengyang [1 ]
Cai, Lei [2 ]
Ji, Shuiwang [1 ]
Affiliations
[1] Texas A&M Univ, Dept Comp Sci & Engn, College Stn, TX 77843 USA
[2] Washington State Univ, Sch Elect Engn & Comp Sci, Pullman, WA 99164 USA
Funding
US National Science Foundation;
Keywords
Convolutional codes; Image coding; Computational modeling; Kernel; Computational efficiency; Mobile handsets; Computer architecture; Deep learning; group convolution; channel-wise convolution; convolutional classification; model compression;
DOI
10.1109/TPAMI.2020.2975796
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Convolutional neural networks (CNNs) have shown great capability in solving various artificial intelligence tasks. However, their increasing model sizes have raised challenges for deployment in resource-limited applications. In this work, we propose to compress deep models by using channel-wise convolutions, which replace dense connections among feature maps with sparse ones in CNNs. Based on this novel operation, we build light-weight CNNs known as ChannelNets. ChannelNets use three instances of channel-wise convolutions, namely group channel-wise convolutions, depth-wise separable channel-wise convolutions, and the convolutional classification layer. Compared to prior CNNs designed for mobile devices, ChannelNets achieve a significant reduction in the number of parameters and computational cost without loss in accuracy. Notably, our work represents an attempt to compress the fully-connected classification layer, which usually accounts for about 25 percent of total parameters in compact CNNs. Along this new direction, we investigate the behavior of our proposed convolutional classification layer and conduct detailed analysis. Based on our in-depth analysis, we further propose convolutional classification layers without weight-sharing. This new classification layer achieves a good trade-off between fully-connected classification layers and the convolutional classification layer. Experimental results on the ImageNet dataset demonstrate that ChannelNets achieve consistently better performance compared to prior methods.
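The core idea in the abstract — replacing dense inter-channel connections with a sparse, shared 1-D filter that slides along the channel axis — can be illustrated with a minimal NumPy sketch. This is an illustrative reconstruction, not the authors' code: the function name `channel_wise_conv` and the unpadded, stride-1 setup are assumptions for the example.

```python
import numpy as np

def channel_wise_conv(x, kernel, stride=1):
    """Slide a 1-D kernel along the channel axis of a (C, H, W) feature map.

    A dense 1x1 convolution mapping C_in to C_out channels needs
    C_in * C_out weights; here only len(kernel) weights are shared
    across all output channels, which is the source of the compression.
    """
    C, H, W = x.shape
    k = kernel.shape[0]
    out_c = (C - k) // stride + 1          # valid (unpadded) output channels
    out = np.zeros((out_c, H, W))
    for i in range(out_c):
        # each output channel mixes only k neighboring input channels
        out[i] = np.tensordot(kernel, x[i * stride : i * stride + k], axes=(0, 0))
    return out

x = np.random.rand(8, 4, 4)   # 8 input channels, 4x4 spatial map
w = np.random.rand(3)         # one shared 1-D kernel over channels
y = channel_wise_conv(x, w)
print(y.shape)                # (6, 4, 4): 3 weights instead of 8*6 for a dense 1x1 conv
```

The same sliding-window view also shows how the convolutional classification layer can replace a fully-connected layer: a single channel-wise convolution over the final feature channels produces the class scores with far fewer parameters.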
Pages: 2570 - 2581
Page count: 12