Filter Pruning via Probabilistic Model-based Optimization for Accelerating Deep Convolutional Neural Networks

Cited by: 2
Authors
Li, Qinghua [1]
Li, Cuiping [1]
Chen, Hong [1]
Affiliations
[1] Renmin Univ China, Informat Sch, Beijing, Peoples R China
Funding
National Key R&D Program of China;
Keywords
Deep Learning; Pruning Models; Accelerating Deep CNNs;
DOI
10.1145/3437963.3441766
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Accelerating deep convolutional neural networks (CNNs) has recently received ever-increasing research attention. Among the various approaches proposed in the literature, filter pruning is regarded as a promising solution owing to the significant speedup and memory reduction it yields for both the network model and the intermediate feature maps. Previous works apply the "smaller-norm-less-important" criterion, alternately pruning and retraining filters with smaller l_p-norm values. However, they ignore the effect of feedback: most current approaches consider only the statistics of the filters (e.g., pruning filters with small l_p-norm values), without using the performance of the pruned model as a feedback signal for the next iteration of filter pruning. To address this lack of feedback, we propose a novel filter pruning method, Filter Pruning via Probabilistic Model-based Optimization (FPPMO), which prunes filters in a probabilistic manner. We introduce a pruning probability for each filter, and pruning is guided by sampling from the resulting probability distribution. An optimization method updates the pruning probabilities based on the performance of the pruned model during the pruning process. The effectiveness of FPPMO is validated on two image classification benchmarks. Notably, on CIFAR-10, FPPMO reduces FLOPs by more than 57% on ResNet-110 with a 0.08% relative accuracy improvement, and on ILSVRC-2012 it reduces FLOPs by more than 50% on ResNet-101 without any top-5 accuracy drop, demonstrating that FPPMO outperforms state-of-the-art filter pruning methods.
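The sample-and-update loop described in the abstract can be illustrated with a toy sketch. The specific update rule, learning rate, and scalar reward below are illustrative assumptions for exposition only; they are not the paper's actual FPPMO optimization method:

```python
import random

def sample_prune_mask(probs, rng):
    """Sample a binary mask from per-filter pruning probabilities (1 = prune)."""
    return [1 if rng.random() < p else 0 for p in probs]

def update_probs(probs, mask, reward, lr=0.1):
    """Feedback step (illustrative rule, not the paper's): if the pruned
    model performed well (reward > 0), raise the pruning probability of
    the filters that were pruned and lower that of the filters that were
    kept; a negative reward does the opposite."""
    updated = []
    for p, pruned in zip(probs, mask):
        if pruned:
            p = p + lr * reward * (1.0 - p)
        else:
            p = p - lr * reward * p
        updated.append(min(max(p, 0.01), 0.99))  # keep probabilities in (0, 1)
    return updated

# One pruning iteration with a fixed seed and a dummy reward signal.
rng = random.Random(0)
probs = [0.5, 0.5, 0.5, 0.5]          # initial pruning probability per filter
mask = sample_prune_mask(probs, rng)  # decide which filters to prune this round
probs = update_probs(probs, mask, reward=1.0)
```

In the paper's setting, the reward would come from evaluating the pruned model, so pruning decisions that hurt accuracy are discouraged in later iterations.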
Pages: 653-661 (9 pages)