Pruning CNN's with Linear Filter Ensembles

Cited: 3
|
Authors
Sandor, Csanad [1 ,2 ]
Pavel, Szabolcs [1 ,2 ]
Csato, Lehel [1 ]
Affiliations
[1] Babes Bolyai Univ, Cluj Napoca, Romania
[2] Robert Bosch SRL, Cluj Napoca, Romania
Source
ECAI 2020: 24TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE | 2020, Vol. 325
Keywords
DOI
10.3233/FAIA200249
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Despite the promising results of convolutional neural networks (CNNs), their application on devices with limited resources remains a big challenge; this is mainly due to the huge memory and computation requirements of CNNs. To counter the limitation imposed by the network size, we use pruning to reduce the network size and, implicitly, the number of floating point operations (FLOPs). In contrast to the filter-norm methods used in "conventional" network pruning, which assume that a smaller norm implies "less importance" of the associated component, we develop a novel filter importance measure based on the change in the empirical loss caused by the presence or removal of a component from the network architecture. Since there are too many individual filter configurations to evaluate exhaustively, we repeatedly sample ensembles of these architectural components and measure the system performance in the respective state of components being active or disabled. The result is a collection of filter ensembles - filter masks - and associated performance values. We rank the filters based on a linear and additive model and remove the least important ones such that the drop in network accuracy is minimal. We evaluate our method on a fully connected network, as well as on the ResNet architecture trained on the CIFAR-10 dataset. Using our pruning method, we removed 60% of the parameters and 64% of the FLOPs from the ResNet with an accuracy drop of less than 0.6%.
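The ranking idea in the abstract - sample random filter masks, record the loss of each masked network, then fit a linear additive model attributing the loss to individual filters - can be illustrated with a small sketch. This is not the authors' implementation; the filter count, sample count, and synthetic "true" importances below are all hypothetical toy values used only to generate data that the linear model can recover.

```python
import numpy as np

rng = np.random.default_rng(0)
n_filters = 8     # hypothetical number of filters in one layer
n_samples = 200   # number of sampled filter ensembles (masks)

# Hypothetical per-filter importance: the loss increase caused by removing
# that filter. Unknown in practice; here it only generates synthetic data.
true_importance = rng.uniform(0.0, 1.0, n_filters)

# Sample filter ensembles: masks[i, j] = 1 means filter j is DISABLED
# in the i-th sampled configuration.
masks = rng.integers(0, 2, size=(n_samples, n_filters)).astype(float)

# Observed loss of each masked network under the additive assumption:
# base loss plus the importances of the disabled filters, plus noise.
base_loss = 0.5
losses = base_loss + masks @ true_importance + rng.normal(0.0, 0.01, n_samples)

# Fit the linear additive model by least squares: one intercept column
# plus one column per filter mask; coefficients estimate importance.
X = np.hstack([np.ones((n_samples, 1)), masks])
coef, *_ = np.linalg.lstsq(X, losses, rcond=None)
est_importance = coef[1:]

# Rank filters: the smallest estimated loss increase is the safest to prune.
prune_order = np.argsort(est_importance)
print("filters ranked from least to most important:", prune_order)
```

With enough sampled masks, the least-squares coefficients recover each filter's loss contribution, and pruning proceeds greedily from the bottom of the ranking.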
Pages: 1435-1442
Page count: 8
Related Papers
50 records in total
  • [31] Collective-agreement-based pruning of ensembles
    Rokach, Lior
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2009, 53 (04) : 1015 - 1026
  • [32] Method for pruning Bagging ensembles and its applications
    School of Economics and Finance, Xi'an Jiaotong University, Xi'an 710049, China
    [authors not listed]
    Xitong Gongcheng Lilun yu Shijian (Systems Engineering - Theory &amp; Practice), 2008, (7): 105-110
  • [33] A pruning algorithm for training neural network ensembles
    Shahjahan, M
    Akhand, MAH
    Murase, K
    SICE 2003 ANNUAL CONFERENCE, VOLS 1-3, 2003, : 628 - 633
  • [34] ResPrune: An energy-efficient restorative filter pruning method using stochastic optimization for accelerating CNN
    Jayasimhan, Anusha
    Pabitha, P.
    PATTERN RECOGNITION, 2024, 155
  • [35] Despeckling CNN with Ensembles of Classical Outputs
    Mishra, Deepak
    Tyagi, Sarthak
    Chaudhury, Santanu
    Sarkar, Mukul
    Soin, Arvinder Singh
    2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 3802 - 3807
  • [36] A Study of Filter Duplication for CNNs Filter Pruning
    Ikuta, Ryosuke
    Yata, Noriko
    Manabe, Yoshitsugu
    INTERNATIONAL WORKSHOP ON ADVANCED IMAGING TECHNOLOGY, IWAIT 2024, 2024, 13164
  • [37] Shift Pruning: Equivalent Weight Pruning for CNN via Differentiable Shift Operator
    Niu, Tao
    Lou, Yihang
    Teng, Yinglei
    He, Jianzhong
    Liu, Yiding
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2023, 2023, : 5445 - 5454
  • [38] Filter Sketch for Network Pruning
    Lin, Mingbao
    Cao, Liujuan
    Li, Shaojie
    Ye, Qixiang
    Tian, Yonghong
    Liu, Jianzhuang
    Tian, Qi
    Ji, Rongrong
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (12) : 7091 - 7100
  • [39] A pruning algorithm for training cooperative neural network ensembles
    Shahjahan, M
    Murase, K
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2006, E89D (03): : 1257 - 1269
  • [40] Energy-Based Clustering for Pruning Heterogeneous Ensembles
    Cela, Javier
    Suarez, Alberto
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2018, PT I, 2018, 11139 : 346 - 351