Entropy Induced Pruning Framework for Convolutional Neural Networks

Cited by: 0
Authors
Lu, Yiheng [1 ]
Guan, Ziyu [1 ]
Yang, Yaming [1 ]
Zhao, Wei [1 ]
Gong, Maoguo [1 ]
Xu, Cai [1 ]
Affiliations
[1] Xidian University, Key Laboratory of Collaborative Intelligence Systems, Ministry of Education, Xi'an, People's Republic of China
Funding
National Natural Science Foundation of China
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Structured pruning techniques have achieved strong compression performance on convolutional neural networks for image classification tasks. However, most existing methods are sensitive to the model parameters, and their pruning results may be unsatisfactory when the original model is poorly trained. That is, they require the original model to be fully trained in order to obtain useful weight information. This is time-consuming, and it makes the effectiveness of the pruning results dependent on the degree of model optimization. To address this issue, we propose a novel metric named Average Filter Information Entropy (AFIE). It decomposes the weight matrix of each layer into a low-rank space and quantifies filter importance based on the distribution of the normalized eigenvalues. Intuitively, the eigenvalues capture the covariance among filters and can therefore serve as a good guide for pruning. Since the distribution of eigenvalues is robust to parameter updates, AFIE yields a stable evaluation of each filter's importance regardless of whether the original model is fully trained. We implement our AFIE-based pruning method for three popular CNN models, AlexNet, VGG-16, and ResNet-50, and test them on three widely used image datasets, MNIST, CIFAR-10, and ImageNet, respectively. The experimental results are encouraging: even when the original model is trained for only one epoch, the AFIE score of each filter remains identical to the score obtained when the model is fully trained, which demonstrates the effectiveness of the proposed pruning method.
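To make the metric concrete, below is a minimal sketch of an eigenvalue-entropy score of the kind the abstract describes: the layer's flattened weight matrix is decomposed by SVD, and the squared singular values (the eigenvalues of the filter covariance) are normalized into a distribution whose Shannon entropy is averaged over the filters. The exact AFIE formula is not reproduced in this record, so the flattening, the normalization, and the averaging are assumptions; afie_score is a hypothetical helper, not the authors' implementation, and it yields one layer-level average, whereas the paper assigns a score to each filter.

    import numpy as np

    def afie_score(weight, eps=1e-12):
        # weight: conv kernel of shape (out_channels, in_channels, kH, kW)
        n_filters = weight.shape[0]
        mat = weight.reshape(n_filters, -1)        # one row per filter
        s = np.linalg.svd(mat, compute_uv=False)   # spectrum of the flattened layer
        p = s**2 / np.sum(s**2)                    # normalized eigenvalues of mat @ mat.T
        entropy = -np.sum(p * np.log(p + eps))     # Shannon entropy of the spectrum
        return entropy / n_filters                 # average over the layer's filters

    w = np.random.randn(64, 32, 3, 3)              # e.g., a randomly initialized 3x3 conv layer
    print(afie_score(w))

Because the score depends only on the shape of the spectrum, not on the magnitude of individual weights, it is plausible that it changes little as training proceeds, consistent with the one-epoch observation above; how the score is mapped to individual filter importance follows the authors' full method, which this record does not detail.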
Pages: 3918-3926 (9 pages)
Related Papers (50 total)
  • [1] Cross-Entropy Pruning for Compressing Convolutional Neural Networks
    Bao, Rongxin
    Yuan, Xu
    Chen, Zhikui
    Ma, Ruixin
    [J]. NEURAL COMPUTATION, 2018, 30 (11): 3128-3149
  • [2] Entropy-based pruning method for convolutional neural networks
    Hur, Cheonghwan
    Kang, Sanggil
    [J]. JOURNAL OF SUPERCOMPUTING, 2019, 75 (06): 2950-2963
  • [3] Fractional Step Discriminant Pruning: A Filter Pruning Framework for Deep Convolutional Neural Networks
    Gkalelis, Nikolaos
    Mezaris, Vasileios
    [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO WORKSHOPS (ICMEW), 2020
  • [4] Iterative clustering pruning for convolutional neural networks
    Chang, Jingfei
    Lu, Yang
    Xue, Ping
    Xu, Yiqun
    Wei, Zhen
    [J]. KNOWLEDGE-BASED SYSTEMS, 2023, 265
  • [5] Leveraging Structured Pruning of Convolutional Neural Networks
    Tessier, Hugo
    Gripon, Vincent
    Leonardon, Mathieu
    Arzel, Matthieu
    Bertrand, David
    Hannagan, Thomas
    [J]. 2022 IEEE WORKSHOP ON SIGNAL PROCESSING SYSTEMS (SIPS), 2022: 174-179
  • [6] Flattening Layer Pruning in Convolutional Neural Networks
    Jeczmionek, Ernest
    Kowalski, Piotr A.
    [J]. SYMMETRY-BASEL, 2021, 13 (07)
  • [7] Structured Pruning of Deep Convolutional Neural Networks
    Anwar, Sajid
    Hwang, Kyuyeon
    Sung, Wonyong
    [J]. ACM JOURNAL ON EMERGING TECHNOLOGIES IN COMPUTING SYSTEMS, 2017, 13 (03)
  • [8] Activation Pruning of Deep Convolutional Neural Networks
    Ardakani, Arash
    Condo, Carlo
    Gross, Warren J.
    [J]. 2017 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP 2017), 2017: 1325-1329
  • [9] Blending Pruning Criteria for Convolutional Neural Networks
    He, Wei
    Huang, Zhongzhan
    Liang, Mingfu
    Liang, Senwei
    Yang, Haizhao
    [J]. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT IV, 2021, 12894: 3-15