Filter Contribution Recycle: Boosting Model Pruning with Small Norm Filters

Cited: 1
Authors
Chen, Zehong [1 ,2 ,3 ]
Xie, Zhonghua [1 ]
Wang, Zhen [1 ]
Xu, Tao [1 ]
Zhang, Zhengrui [1 ]
Affiliations
[1] Huizhou Univ, Sch Comp Sci & Engn, Huizhou 516007, Peoples R China
[2] Guangdong Key Lab Intelligent Informat Proc, Shenzhen 518060, Peoples R China
[3] Shenzhen Key Lab Media Secur, Shenzhen 518060, Peoples R China
Keywords
Model pruning; Structured pruning; Filter contribution recycle; Filter weight reutilization; Networks
DOI
10.3837/tiis.2022.11.003
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Number
0812
Abstract
Model pruning methods have attracted huge attention recently owing to the increasing demand for deploying models on low-resource devices. Most existing methods use the weight norm of filters to represent their importance and directly discard those with small values to reach the pruning target, which ignores the contribution of the small norm filters. This not only wastes filter contribution, but also yields performance comparable to training from randomly initialized weights [1]. In this paper, we point out that discarding the small norm filters directly can greatly harm the performance of the pruned model. Therefore, we propose a novel filter contribution recycle (FCR) method for structured model pruning to resolve the aforementioned problem. FCR collects and reassembles contribution from the small norm filters to obtain a mixed contribution collector, and then assigns the reassembled contribution to other filters that are more likely to be preserved. To achieve the target FLOPs, FCR also adopts a weight decay strategy for the small norm filters. To explore the effectiveness of our approach, extensive experiments are conducted on the ImageNet2012 and CIFAR-10 datasets, and superior results are reported compared with other methods under the same or even larger FLOPs reduction. In addition, our method can be flexibly combined with other pruning criteria.
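The abstract describes three steps: collect the weights of small-norm filters into a mixed contribution collector, redistribute that collector to the filters likely to be preserved, and apply extra weight decay to the small-norm filters until the target FLOPs can be reached. A minimal NumPy sketch of one such update on a single layer's filter bank is given below; the function name `fcr_step`, the equal redistribution over preserved filters, and the `keep_ratio`/`decay` parameters are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def fcr_step(filters, keep_ratio=0.75, decay=0.9):
    """One hypothetical FCR-style update on a conv layer's filter bank.

    filters: array of shape (num_filters, ...) -- each row is one filter.
    The small-norm filters donate their weights to a mixed "contribution
    collector", which is spread over the filters being preserved; the
    donors are then shrunk by an extra weight-decay factor so they can
    eventually be removed to hit the target FLOPs.
    """
    # L1 norm of each filter, used here as the importance criterion.
    norms = np.abs(filters).reshape(len(filters), -1).sum(axis=1)
    order = np.argsort(norms)                        # ascending by norm
    n_keep = int(round(keep_ratio * len(filters)))
    small_idx = order[: len(filters) - n_keep]       # pruning candidates
    keep_idx = order[len(filters) - n_keep:]         # filters to preserve

    out = filters.copy()
    # Step 1: reassemble the small-norm filters into one collector.
    collector = out[small_idx].sum(axis=0)
    # Step 2: recycle the collected contribution into preserved filters
    # (equal split is an assumption; any weighting scheme would fit here).
    out[keep_idx] += collector / len(keep_idx)
    # Step 3: weight-decay the small-norm filters toward zero.
    out[small_idx] *= decay
    return out, small_idx
```

Because the method only redistributes and decays weights, it composes with any importance criterion: replacing the L1 norm above with another score changes which filters donate, not the recycle mechanism itself.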
Pages: 3507-3522 (16 pages)
Related Papers
50 records in total
  • [1] Filter Pruning with Convolutional Approximation Small Model Framework
    Intraraprasit, Monthon
    Chitsobhuk, Orachat
    [J]. COMPUTATION, 2023, 11 (09)
  • [2] Pruning of Network Filters for Small Dataset
    Li, Zhuang
    Xu, Lihong
    Zhu, Shuwei
    [J]. IEEE ACCESS, 2020, 8 : 4522 - 4533
  • [3] Pruning Filters Base on Extending Filter Group Lasso
    Xie, Zhihong
    Li, Ping
    Li, Fei
    Guo, Changyi
    [J]. IEEE ACCESS, 2020, 8 : 217867 - 217876
  • [4] Filter pruning - deeper layers need fewer filters
    Wang, Heng
    Ye, Xiang
    Li, Yong
    [J]. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2022, 42 (03) : 1977 - 1990
  • [5] Norm-Correlation based filter pruning to accelerating networks
    Hong, Minsoo
    Kim, Sungjei
    Jeong, Jinwoo
    [J]. 12TH INTERNATIONAL CONFERENCE ON ICT CONVERGENCE (ICTC 2021): BEYOND THE PANDEMIC ERA WITH ICT CONVERGENCE INNOVATION, 2021, : 1393 - 1396
  • [6] COLOR CONSTANCY MODEL OPTIMIZATION WITH SMALL DATASET VIA PRUNING OF CNN FILTERS
    Husseini, Sahar
    Babahajiani, Pouria
    Gabbouj, Moncef
    [J]. PROCEEDINGS OF THE 2021 9TH EUROPEAN WORKSHOP ON VISUAL INFORMATION PROCESSING (EUVIP), 2021,
  • [8] Pruning filters with L1-norm and capped L1-norm for CNN compression
    Kumar, Aakash
    Shaikh, Ali Muhammad
    Li, Yun
    Bilal, Hazrat
    Yin, Baoqun
    [J]. APPLIED INTELLIGENCE, 2021, 51 (02) : 1152 - 1160
  • [9] Pruning Filters With L1-norm And Standard Deviation for CNN Compression
    Sun, Xinlu
    Zhou, Dianle
    Pan, Xiaotian
    Zhong, Zhiwei
    Wang, Fei
    [J]. ELEVENTH INTERNATIONAL CONFERENCE ON MACHINE VISION (ICMV 2018), 2019, 11041
  • [10] Model pruning based on filter similarity for edge device deployment
    Wu, Tingting
    Song, Chunhe
    Zeng, Peng
    [J]. FRONTIERS IN NEUROROBOTICS, 2023, 17