Compression of Deep Convolutional Neural Networks Using Effective Channel Pruning

Cited by: 1
|
Authors
Guo, Qingbei [1 ,2 ]
Wu, Xiao-Jun [1 ]
Zhao, Xiuyang [2 ]
Affiliations
[1] Jiangnan Univ, Jiangsu Prov Engn Lab Pattern Recognit & Computat, Wuxi 214122, Jiangsu, Peoples R China
[2] Univ Jinan, Shandong Prov Key Lab Network Based Intelligent C, Jinan 250022, Peoples R China
Source
Funding
National Key R&D Program of China; National Natural Science Foundation of China; UK Engineering and Physical Sciences Research Council;
Keywords
Deep neural network; Classification; Pruning;
DOI
10.1007/978-3-030-34120-6_62
CLC Number
TP301 [Theory, Methods];
Discipline Code
081202;
Abstract
Pruning is a promising technique for reducing the high computational complexity and memory requirements of convolutional neural networks (CNNs). However, channel pruning faces a principal challenge: although the least important feature map is removed at each step according to a pruning criterion, each removal may produce considerable fluctuations in classification performance, which can prevent the pruned network from recovering its capacity. We propose an effective channel pruning criterion that reduces redundant parameters while significantly reducing such fluctuations. This criterion evaluates the importance of each channel with a loss-approximating Taylor expansion computed not from the pruned parameters themselves but from the parameters of the subsequent convolutional layer, which differentiates our method from existing ones. To improve learning effectiveness and efficiency, channels are ranked using a small proportion of the training dataset. Furthermore, after each least important channel is pruned, a small fraction of the training dataset is used to fine-tune the pruned network and partially recover its accuracy; periodically, a larger proportion of the training dataset is used for intensive accuracy recovery. The proposed criterion substantially alleviates the aforementioned problems and outperforms other criteria, such as the Random, APoZ, and Taylor pruning criteria. Experimental results on several public image classification datasets and popular deep network architectures demonstrate the excellent compactness achieved by our approach. Our code is available at: https://github.com/QingbeiGuo/BasedTaylor-Pruning.git.
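The core idea described above — scoring a channel by a first-order Taylor approximation of the loss change, computed from the *subsequent* layer's parameters rather than the pruned layer's own — can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formulation: the helper name `taylor_channel_importance` and the specific reduction (summing |w · ∂L/∂w| over all axes except the input-channel axis) are assumptions for illustration.

```python
import numpy as np

def taylor_channel_importance(next_weight, next_weight_grad):
    """Score each candidate channel of a conv layer by a first-order
    Taylor estimate of the loss change, using the weights and gradients
    of the *subsequent* conv layer (illustrative sketch only).

    Both arguments have shape (out_channels, in_channels, kH, kW),
    where in_channels indexes the channels being considered for pruning.
    """
    # |w * dL/dw| approximates how much the loss would change if w
    # were zeroed; summing over every axis except in_channels gives
    # one importance score per prunable channel.
    contrib = np.abs(next_weight * next_weight_grad)
    return contrib.sum(axis=(0, 2, 3))

# Toy example: the next layer has 4 filters over 3 candidate channels
# with 3x3 kernels; in practice these come from a backward pass on a
# small fraction of the training set, as the abstract describes.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 3, 3, 3))
g = rng.normal(size=(4, 3, 3, 3))
scores = taylor_channel_importance(w, g)
least_important = int(np.argmin(scores))  # channel to prune next
```

After pruning `least_important`, the method fine-tunes briefly on a small data fraction before the next ranking step, which is what dampens the accuracy fluctuations the abstract highlights.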
Pages: 760 - 772
Page count: 13
Related Papers
50 records in total
  • [1] Structured Pruning of Deep Convolutional Neural Networks
    Anwar, Sajid
    Hwang, Kyuyeon
    Sung, Wonyong
    [J]. ACM JOURNAL ON EMERGING TECHNOLOGIES IN COMPUTING SYSTEMS, 2017, 13 (03)
  • [2] Activation Pruning of Deep Convolutional Neural Networks
    Ardakani, Arash
    Condo, Carlo
    Gross, Warren J.
    [J]. 2017 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP 2017), 2017, : 1325 - 1329
  • [3] RepSGD: Channel Pruning Using Reparameterization for Accelerating Convolutional Neural Networks
    Kim, Nam Joon
    Kim, Hyun
    [J]. 2023 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, ISCAS, 2023,
  • [4] Acceleration of Deep Convolutional Neural Networks Using Adaptive Filter Pruning
    Singh, Pravendra
    Verma, Vinay Kumar
    Rai, Piyush
    Namboodiri, Vinay P.
    [J]. IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2020, 14 (04) : 838 - 847
  • [5] Studying the plasticity in deep convolutional neural networks using random pruning
    Mittal, Deepak
    Bhardwaj, Shweta
    Khapra, Mitesh M.
    Ravindran, Balaraman
    [J]. MACHINE VISION AND APPLICATIONS, 2019, 30 (02) : 203 - 216
  • [7] Accelerating Convolutional Neural Networks with Dynamic Channel Pruning
    Zhang, Chiliang
    Hu, Tao
    Guan, Yingda
    Ye, Zuochang
    [J]. 2019 DATA COMPRESSION CONFERENCE (DCC), 2019, : 563 - 563
  • [8] Structured Pruning for Deep Convolutional Neural Networks: A Survey
    He, Yang
    Xiao, Lingao
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (05) : 2900 - 2919
  • [9] Iris Image Compression Using Deep Convolutional Neural Networks
    Jalilian, Ehsaneddin
    Hofbauer, Heinz
    Uhl, Andreas
    [J]. SENSORS, 2022, 22 (07)
  • [10] Automatic Compression Ratio Allocation for Pruning Convolutional Neural Networks
    Liu, Yunfeng
    Kong, Huihui
    Yu, Peihua
    [J]. ICVISP 2019: PROCEEDINGS OF THE 3RD INTERNATIONAL CONFERENCE ON VISION, IMAGE AND SIGNAL PROCESSING, 2019,