Rethinking the Pruning Criteria for Convolutional Neural Network

Cited: 0
Authors
Huang, Zhongzhan [1 ]
Shao, Wenqi [2 ,3 ]
Wang, Xinjiang [3 ]
Lin, Liang [1 ]
Luo, Ping [4 ]
Affiliations
[1] Sun Yat Sen Univ, Guangzhou, Guangdong, Peoples R China
[2] Chinese Univ Hong Kong, Hong Kong, Peoples R China
[3] SenseTime Res, Hong Kong, Peoples R China
[4] Univ Hong Kong, Hong Kong, Peoples R China
Funding
US National Science Foundation; National Key Research and Development Program of China;
Keywords
DOI
None available
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Channel pruning is a popular technique for compressing convolutional neural networks (CNNs), and various pruning criteria have been proposed to identify and remove redundant filters. From our comprehensive experiments, we find two blind spots in pruning criteria: (1) Similarity: several primary pruning criteria that are widely cited and compared are strongly similar to one another; the filter importance-score rankings they produce are almost identical, resulting in similar pruned structures. (2) Applicability: the filter importance scores measured by some pruning criteria are too close to one another to distinguish network redundancy well. In this paper, we analyze these blind spots across different types of pruning criteria under both layer-wise and global pruning. We also challenge some common beliefs, showing, for example, that the results of l(1) and l(2) pruning are not always similar. These analyses rest on empirical experiments and on our Convolutional Weight Distribution Assumption: that the well-trained convolutional filters in each layer approximately follow a Gaussian-like distribution. We verify this assumption through systematic and extensive statistical tests.
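The "similarity" blind spot above can be illustrated with a small numerical sketch. The snippet below is a hypothetical illustration, not the paper's code: it draws a layer of Gaussian filters (consistent with the Convolutional Weight Distribution Assumption), scores them with the l(1)- and l(2)-norm criteria, and compares the two importance rankings with a Spearman rank correlation. All names and parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated "well-trained" layer: 64 filters of shape 3x3x3,
# drawn from a Gaussian, per the Convolutional Weight
# Distribution Assumption described in the abstract.
filters = rng.normal(0.0, 0.1, size=(64, 3, 3, 3))
flat = filters.reshape(64, -1)

l1_scores = np.abs(flat).sum(axis=1)           # l1 pruning criterion
l2_scores = np.sqrt((flat ** 2).sum(axis=1))   # l2 pruning criterion

def ranks(x):
    """Return the rank (0 = smallest) of each element of x."""
    r = np.empty(len(x), dtype=int)
    r[np.argsort(x)] = np.arange(len(x))
    return r

# Spearman rank correlation = Pearson correlation of the ranks.
rho = np.corrcoef(ranks(l1_scores), ranks(l2_scores))[0, 1]
print(f"Spearman rank correlation of l1 vs l2 rankings: {rho:.3f}")
```

A correlation near 1 would mean the two criteria select nearly the same filters to prune; per the abstract, this similarity holds often but not always, so the correlation depends on the actual weight distribution of the trained layer.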
Pages: 14
Related Papers
50 records total
  • [1] Blending Pruning Criteria for Convolutional Neural Networks
    He, Wei
    Huang, Zhongzhan
    Liang, Mingfu
    Liang, Senwei
    Yang, Haizhao
    [J]. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT IV, 2021, 12894 : 3 - 15
  • [2] Variational Convolutional Neural Network Pruning
    Zhao, Chenglong
    Ni, Bingbing
    Zhang, Jian
    Zhao, Qiwei
    Zhang, Wenjun
    Tian, Qi
    [J]. 2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 2775 - 2784
  • [3] Convolutional Neural Network Pruning: A Survey
    Xu, Sheng
    Huang, Anran
    Chen, Lei
    Zhang, Baochang
    [J]. PROCEEDINGS OF THE 39TH CHINESE CONTROL CONFERENCE, 2020, : 7458 - 7463
  • [4] Measurement criteria for neural network pruning
    Erdogan, SS
    Ng, GS
    Patrick, KHC
    [J]. 1996 IEEE TENCON - DIGITAL SIGNAL PROCESSING APPLICATIONS PROCEEDINGS, VOLS 1 AND 2, 1996, : 83 - 89
  • [5] Overview of Deep Convolutional Neural Network Pruning
    Li, Guang
    Liu, Fang
    Xia, Yuping
    [J]. 2020 INTERNATIONAL CONFERENCE ON IMAGE, VIDEO PROCESSING AND ARTIFICIAL INTELLIGENCE, 2020, 11584
  • [6] Pruning Convolutional Neural Network with Distinctiveness Approach
    Li, Wenrui
    Plested, Jo
    [J]. NEURAL INFORMATION PROCESSING, ICONIP 2019, PT V, 2019, 1143 : 448 - 455
  • [7] Thinning of convolutional neural network with mixed pruning
    Yang, Wenzhu
    Jin, Lilei
    Wang, Sile
    Cui, Zhenchao
    Chen, Xiangyang
    Chen, Liping
    [J]. IET IMAGE PROCESSING, 2019, 13 (05) : 779 - 784
  • [8] ScoringNet: A Neural Network Based Pruning Criteria for Structured Pruning
    Wang, Shuang
    Zhang, Zhaogong
    [J]. SCIENTIFIC PROGRAMMING, 2023, 2023
  • [9] Convolutional Neural Network Pruning with Structural Redundancy Reduction
    Wang, Zi
    Li, Chengcheng
    Wang, Xiangyang
    [J]. 2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 14908 - 14917
  • [10] Channel pruning based on convolutional neural network sensitivity
    Yang, Chenbin
    Liu, Huiyi
    [J]. NEUROCOMPUTING, 2022, 507 : 97 - 106