Batch-Normalization-based Soft Filter Pruning for Deep Convolutional Neural Networks

Times Cited: 0
Authors
Xu, Xiaozhou [1 ]
Chen, Qiming [1 ]
Xie, Lei [1 ]
Su, Hongye [1 ]
Affiliations
[1] Zhejiang Univ, Coll Control Sci & Engn, State Key Lab Ind Control Technol, Hangzhou, Peoples R China
Keywords
object classification; model compression; network pruning
DOI
10.1109/icarcv50220.2020.9305319
CLC Classification Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
As convolutional neural networks contain many redundant parameters, many methods have been developed to compress the network and accelerate inference. Among these, network pruning, a widely used class of approaches, can effectively decrease the memory footprint and reduce the computation cost. Herein, we propose a competitive pruning approach based on Soft Filter Pruning (SFP) that takes the scaling factors (γ) of Batch Normalization (BN) layers as the criterion of the filter selection strategy. During the soft pruning procedure, in each epoch only the γ values of BN layers that fall below a threshold are set to zero, instead of setting the weights of the selected filters in the convolutional layers to zero. Compared with existing approaches, the proposed method obtains a notably higher accuracy on image recognition. Notably, on CIFAR-10 the proposed method reduces the same 40.8% of FLOPs as SFP on ResNet-110 while even improving top-1 accuracy by 0.87%.
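The pruning step described in the abstract can be illustrated with a short sketch. The following is a minimal PyTorch-style example, not the authors' code: it assumes a model built from standard nn.BatchNorm2d layers and uses a hypothetical helper soft_prune_bn that zeroes the BN scaling factors (γ) whose magnitudes fall below a global percentile threshold; the exact threshold rule is not specified in the abstract and is an assumption here.

```python
import torch
import torch.nn as nn


def soft_prune_bn(model: nn.Module, prune_ratio: float = 0.4) -> None:
    """Hypothetical helper: zero the smallest |gamma| values across all BN layers."""
    # Collect the absolute BN scaling factors (gamma) of the whole model.
    gammas = torch.cat([m.weight.data.abs().flatten()
                        for m in model.modules() if isinstance(m, nn.BatchNorm2d)])
    # Global threshold chosen so that roughly `prune_ratio` of the gammas fall
    # below it (an assumption; a per-layer or fixed threshold is also possible).
    k = int(prune_ratio * gammas.numel())
    if k == 0:
        return
    threshold = gammas.sort().values[k - 1]
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            # Soft pruning: zero gamma (and the corresponding shift beta) in place,
            # leaving the convolutional weights untouched so they can recover later.
            mask = (m.weight.data.abs() > threshold).float()
            m.weight.data.mul_(mask)
            m.bias.data.mul_(mask)
```

In a soft-pruning training loop, such a helper would be called at the end of each epoch and training would then continue, so filters whose γ was zeroed can regain importance before any final hard pruning pass.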
Pages: 951-956
Number of pages: 6
Related Papers
50 records in total
  • [1] Asymptotic Soft Filter Pruning for Deep Convolutional Neural Networks
    He, Yang
    Dong, Xuanyi
    Kang, Guoliang
    Fu, Yanwei
    Yan, Chenggang
    Yang, Yi
    [J]. IEEE TRANSACTIONS ON CYBERNETICS, 2020, 50 (08) : 3594 - 3604
  • [2] Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks
    He, Yang
    Kang, Guoliang
    Dong, Xuanyi
    Fu, Yanwei
    Yang, Yi
    [J]. PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018, : 2234 - 2240
  • [3] An optimal-score-based filter pruning for deep convolutional neural networks
    Sawant, Shrutika S.
    Bauer, J.
    Erick, F. X.
    Ingaleshwar, Subodh
    Holzer, N.
    Ramming, A.
    Lang, E. W.
    Goetz, Th
    [J]. APPLIED INTELLIGENCE, 2022, 52 (15) : 17557 - 17579
  • [4] An optimal-score-based filter pruning for deep convolutional neural networks
    Shrutika S. Sawant
    J. Bauer
    F. X. Erick
    Subodh Ingaleshwar
    N. Holzer
    A. Ramming
    E. W. Lang
    Th. Götz
    [J]. Applied Intelligence, 2022, 52 : 17557 - 17579
  • [5] Soft Taylor Pruning for Accelerating Deep Convolutional Neural Networks
    Rong, Jintao
    Yu, Xiyi
    Zhang, Mingyang
    Ou, Linlin
    [J]. IECON 2020: THE 46TH ANNUAL CONFERENCE OF THE IEEE INDUSTRIAL ELECTRONICS SOCIETY, 2020, : 5343 - 5349
  • [6] FRACTIONAL STEP DISCRIMINANT PRUNING: A FILTER PRUNING FRAMEWORK FOR DEEP CONVOLUTIONAL NEURAL NETWORKS
    Gkalelis, Nikolaos
    Mezaris, Vasileios
    [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO WORKSHOPS (ICMEW), 2020,
  • [7] Acceleration of Deep Convolutional Neural Networks Using Adaptive Filter Pruning
    Singh, Pravendra
    Verma, Vinay Kumar
    Rai, Piyush
    Namboodiri, Vinay P.
    [J]. IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2020, 14 (04) : 838 - 847
  • [8] Filter Pruning for Efficient Transfer Learning in Deep Convolutional Neural Networks
    Reinhold, Caique
    Roisenberg, Mauro
    [J]. ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, PT I, 2019, 11508 : 191 - 202
  • [9] Learning Filter Pruning Criteria for Deep Convolutional Neural Networks Acceleration
    He, Yang
    Ding, Yuhang
    Liu, Ping
    Zhu, Linchao
    Zhang, Hanwang
    Yang, Yi
    [J]. 2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2020, : 2006 - 2015
  • [10] A Filter Rank Based Pruning Method for Convolutional Neural Networks
    Liu, Hao
    Guan, Zhenyu
    Lei, Peng
    [J]. 2021 IEEE 20TH INTERNATIONAL CONFERENCE ON TRUST, SECURITY AND PRIVACY IN COMPUTING AND COMMUNICATIONS (TRUSTCOM 2021), 2021, : 1318 - 1322