ONLINE FILTER CLUSTERING AND PRUNING FOR EFFICIENT CONVNETS

Cited: 0
Authors
Zhou, Zhengguang [1 ]
Zhou, Wengang [1 ]
Hong, Richang [2 ]
Li, Houqiang [1 ]
Affiliations
[1] Univ Sci & Technol China, EEIS Dept, CAS Key Lab Technol Geospatial Informat Proc & Ap, Hefei, Anhui, Peoples R China
[2] HeFei Univ Technol, Hefei, Anhui, Peoples R China
Keywords
Deep neural networks; similar filter; filter pruning; cluster loss;
DOI
Not available
CLC Number
TP31 [Computer Software];
Subject Classification Code
081202; 0835;
Abstract
Pruning filters is an effective method for accelerating deep neural networks (DNNs), but most existing approaches prune filters directly on a pre-trained network, which limits the achievable acceleration. Although each filter has its own effect in a DNN, if two filters are identical to each other, one of them can be pruned safely. In this paper, we add an extra cluster loss term to the loss function, which forces the filters in each cluster to become similar during training. After training, we keep one filter in each cluster, prune the others, and fine-tune the pruned network to compensate for the accuracy loss. In particular, the clusters in every layer can be defined in advance, which is effective for pruning DNNs with residual blocks. Extensive experiments on the CIFAR10 and CIFAR100 benchmarks demonstrate the competitive performance of our proposed filter pruning method.
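The abstract does not give the exact form of the cluster loss, so the following is only a minimal sketch of one plausible formulation, assuming the loss penalizes each filter's squared distance to the mean of its pre-defined cluster. The names cluster_loss, clusters_per_layer, and lambda_cluster are illustrative assumptions, not the authors' published formulation.

```python
import torch
import torch.nn as nn

def cluster_loss(weight, cluster_ids):
    """Penalize each filter's distance to its cluster centroid.

    weight      : conv weight of shape (out_channels, in_channels, k, k)
    cluster_ids : LongTensor of shape (out_channels,) assigning every filter
                  to a cluster that is fixed before training (assumption).
    """
    filters = weight.flatten(1)                      # (out_channels, in*k*k)
    loss = weight.new_zeros(())
    for c in cluster_ids.unique():
        members = filters[cluster_ids == c]          # filters in cluster c
        centroid = members.mean(dim=0, keepdim=True)
        loss = loss + ((members - centroid) ** 2).sum()
    return loss

def total_loss(model, task_loss, clusters_per_layer, lambda_cluster=1e-4):
    """Task loss plus a weighted cluster-loss term over selected conv layers.

    clusters_per_layer maps a module name to its filter-cluster assignment;
    lambda_cluster is a hypothetical regularization weight.
    """
    reg = 0.0
    for name, module in model.named_modules():
        if isinstance(module, nn.Conv2d) and name in clusters_per_layer:
            reg = reg + cluster_loss(module.weight, clusters_per_layer[name])
    return task_loss + lambda_cluster * reg
```

After training with such a term, one filter per cluster would be retained and the remaining ones removed, followed by fine-tuning, as described in the abstract.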
Pages: 11-15
Number of pages: 5
Related Papers
50 records in total
  • [1] ONLINE FILTER WEAKENING AND PRUNING FOR EFFICIENT CONVNETS
    Zhou, Zhengguang
    Zhou, Wengang
    Hong, Richang
    Li, Houqiang
    [J]. 2018 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2018,
  • [2] Pruning ConvNets Online for Efficient Specialist Models
    Guo, Jia
    Potkonjak, Miodrag
    [J]. 2017 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW), 2017, : 430 - 437
  • [3] PCA driven mixed filter pruning for efficient convNets
    Ahmed, Waqas
    Ansari, Shahab
    Hanif, Muhammad
    Khalil, Akhtar
    [J]. PLOS ONE, 2022, 17 (01):
  • [4] FALF ConvNets: Fatuous auxiliary loss based filter-pruning for efficient deep CNNs
    Singh, Pravendra
    Kadi, Vinay Sameer Raja
    Namboodiri, Vinay P.
    [J]. IMAGE AND VISION COMPUTING, 2020, 93
  • [5] Structured Pruning for Efficient ConvNets via Incremental Regularization
    Wang, Huan
    Zhang, Qiming
    Wang, Yuehai
    Yu, Lu
    Hu, Haoji
    [J]. 2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019,
  • [6] Learning compact ConvNets through filter pruning based on the saliency of a feature map
    Liu, Zhoufeng
    Liu, Xiaohui
    Li, Chunlei
    Ding, Shumin
    Liao, Liang
    [J]. IET IMAGE PROCESSING, 2022, 16 (01) : 123 - 133
  • [7] Toward Compact ConvNets via Structure-Sparsity Regularized Filter Pruning
    Lin, Shaohui
    Ji, Rongrong
    Li, Yuchao
    Deng, Cheng
    Li, Xuelong
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (02) : 574 - 588
  • [8] A Novel Clustering-Based Filter Pruning Method for Efficient Deep Neural Networks
    Wei, Xiaohui
    Shen, Xiaoxian
    Zhou, Changbao
    Yue, Hengshan
    [J]. ALGORITHMS AND ARCHITECTURES FOR PARALLEL PROCESSING, ICA3PP 2020, PT II, 2020, 12453 : 245 - 258
  • [9] Filter pruning via feature map clustering
    Li, Wei
    He, Yongxing
    Zhang, Xiaoyu
    Tang, Yongchuan
    [J]. INTELLIGENT DATA ANALYSIS, 2023, 27 (04) : 911 - 933
  • [10] ONE-CYCLE PRUNING: PRUNING CONVNETS WITH TIGHT TRAINING BUDGET
    Hubens, Nathan
    Mancas, Matei
    Gosselin, Bernard
    Preda, Marius
    Zaharia, Titus
    [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2022, : 4128 - 4132