Channel pruning guided by global channel relation

Cited by: 0
|
Authors
Yingjie Cheng
Xiaoqi Wang
Xiaolan Xie
Wentao Li
Shaoliang Peng
Affiliations
[1] Hunan University,College of Computer Science and Electronic Engineering
[2] Guilin University of Technology,College of Information Science and Engineering
[3] National University of Defense Technology,School of Computer Science
Source
Applied Intelligence | 2022 / Vol. 52
Keywords
Channel pruning; Global channel relation; SE block; Sparsity;
DOI
Not available
Abstract
Channel pruning approaches have achieved some success in compressing deep convolutional neural networks (CNNs). However, existing methods ignore the interdependence among channels within the same layer. In this paper, we propose a novel channel pruning approach called Channel Pruning guided by Global Channel Relation (CPGCR), which additionally takes the global channel relation into account in the channel pruning process. Since Squeeze-and-Excitation (SE) blocks have the ability to encode the channel relation, our method is mainly used to compress CNNs with SE blocks. We also observe that SE blocks enforce channel-level sparsity in the network, which is useful for the implementation of channel pruning algorithms. Extensive experiments with a variety of neural networks on five datasets clearly demonstrate the effectiveness of our proposed CPGCR method. The results show that on ImageNet, our method gives a 56% reduction in floating-point operations (FLOPs) for ResNet-50. On CIFAR-10, the CNNs compressed by CPGCR achieve accuracy comparable to that of the original models, but with significant reductions in FLOPs (61.7% for ResNet-56, 78.0% for VGG-16).
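The SE block the abstract builds on squeezes each channel to a scalar by global average pooling, passes the result through a small two-layer gate, and rescales the channels by the resulting sigmoid weights. A minimal NumPy sketch of that mechanism is below; the shapes, weight names, and the pruning heuristic in the comments are illustrative assumptions, not the paper's exact CPGCR procedure.

```python
import numpy as np

def se_block(x, w1, w2):
    """Squeeze-and-Excitation on a single feature map x of shape (C, H, W).

    w1: (C//r, C) reduction weights, w2: (C, C//r) expansion weights
    (names and the reduction ratio r are illustrative choices).
    Returns the reweighted feature map and the per-channel gates.
    """
    # Squeeze: global average pooling -> one descriptor per channel, shape (C,)
    z = x.mean(axis=(1, 2))
    # Excitation: bottleneck FC + ReLU, then expansion FC + sigmoid gate in (0, 1)
    h = np.maximum(w1 @ z, 0.0)
    s = 1.0 / (1.0 + np.exp(-(w2 @ h)))
    # Scale: reweight each channel by its gate. Channels whose gates stay
    # near zero across many inputs are natural pruning candidates -- the
    # channel-level sparsity the abstract observes in SE networks.
    return x * s[:, None, None], s
```

A pruning pass in this spirit would average the gates `s` over a dataset and remove channels whose mean gate falls below a threshold.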
Pages: 16202-16213
Number of pages: 11
Related Papers
50 items total
  • [41] Dynamical Conventional Neural Network Channel Pruning by Genetic Wavelet Channel Search for Image Classification
    Chen, Lin
    Gong, Saijun
    Shi, Xiaoyu
    Shang, Mingsheng
    FRONTIERS IN COMPUTATIONAL NEUROSCIENCE, 2021, 15
  • [42] Channel pruning based on convolutional neural network sensitivity
    Yang, Chenbin
    Liu, Huiyi
    NEUROCOMPUTING, 2022, 507 : 97 - 106
  • [43] Neural network pruning based on channel attention mechanism
    Hu, Jianqiang
    Liu, Yang
    Wu, Keshou
    CONNECTION SCIENCE, 2022, 34 (01) : 2201 - 2218
  • [44] Carrying Out CNN Channel Pruning in a White Box
    Zhang, Yuxin
    Lin, Mingbao
    Lin, Chia-Wen
    Chen, Jie
    Wu, Yongjian
    Tian, Yonghong
    Ji, Rongrong
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (10) : 7946 - 7955
  • [45] Gator: Customizable Channel Pruning of Neural Networks with Gating
    Passov, Eli
    David, Eli O.
    Netanyahu, Nathan S.
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT IV, 2021, 12894 : 46 - 58
  • [46] Revisiting Random Channel Pruning for Neural Network Compression
    Li, Yawei
    Adamczewski, Kamil
    Li, Wen
    Gu, Shuhang
    Timofte, Radu
    Van Gool, Luc
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 191 - 201
  • [47] ACP: ADAPTIVE CHANNEL PRUNING FOR EFFICIENT NEURAL NETWORKS
    Zhang, Yuan
    Yuan, Yuan
    Wang, Qi
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 4488 - 4492
  • [48] Accelerating Convolutional Neural Networks with Dynamic Channel Pruning
    Zhang, Chiliang
    Hu, Tao
    Guan, Yingda
    Ye, Zuochang
    2019 DATA COMPRESSION CONFERENCE (DCC), 2019, : 563 - 563
  • [49] DMCP: Differentiable Markov Channel Pruning for Neural Networks
    Guo, Shaopeng
    Wang, Yujie
    Li, Quanquan
    Yan, Junjie
    2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2020, : 1536 - 1544
  • [50] CHANNEL PRUNING VIA ATTENTION MODULE AND MEMORY CURVE
    Li, Hufei
    Cao, Jian
    Liu, Xiangcheng
    Chen, Jue
    Shang, Jingjie
    Qian, Yu
    Wang, Yuan
    2023 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2023, : 1985 - 1989