SeNPIS: Sequential Network Pruning by class-wise Importance Score

Cited by: 7
Authors
Pachon, Cesar G. [1 ]
Ballesteros, Dora M. [1 ]
Renza, Diego [1 ]
Affiliations
[1] Univ Mil Nueva Granada, Carrera 11 101-80, Bogota 110111, Colombia
Keywords
Deep learning; Model compression; Pruning algorithm; Importance score; Convolutional neural network
DOI
10.1016/j.asoc.2022.109558
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
In the last decade, pattern recognition and decision making from images have mainly focused on the development of deep learning architectures, with different types of networks such as sequential, residual and parallel. Although depth and size vary between models, they all have in common that they can contain many filters or neurons that are unimportant for prediction yet negatively impact model size and inference time. It is therefore advantageous to use pruning methods that significantly reduce model size and FLOPs while largely maintaining the classifier's initial performance. In parameter reduction, the decision rule is generally based on mathematical criteria, e.g. the amplitude of the weights, rather than on the actual impact of each filter or neuron on the classifier's performance for each class. We therefore propose SeNPIS, a method that combines filter and neuron selection based on a class-wise importance score with network resizing, to increase the reduction in parameters and FLOPs of sequential CNNs. Several tests were performed to compare SeNPIS with representative state-of-the-art methods on the CIFAR-10 and Scene-15 datasets. For similar accuracy values, and in some cases even with a slight increase in accuracy, SeNPIS reduces the number of parameters by up to an additional 23.5% (i.e., a 51.05% reduction with SeNPIS versus a 27.53% reduction with Gradient) and FLOPs by up to an additional 26.6% (i.e., a 74.82% reduction with SeNPIS versus a 48.16% reduction with Weight) compared to the Weight, Taylor, Gradient and LRP methods. (c) 2022 Elsevier B.V. All rights reserved.
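The abstract describes the method only at a high level. The sketch below illustrates one plausible reading of a class-wise importance score for convolutional filters, assuming mean absolute activation per class as the importance proxy; the layer, variable names (conv, class_scores, importance) and the 50% pruning ratio are illustrative assumptions, not the authors' implementation.

# Minimal sketch (PyTorch): score each filter of one conv layer by its mean
# activation per class, normalize the scores class-wise so that no class
# dominates, average across classes, and keep the highest-ranked filters.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy setup: one conv layer with 16 filters, 10 classes, random "validation" batch.
num_classes, num_filters = 10, 16
conv = nn.Conv2d(3, num_filters, kernel_size=3, padding=1)
images = torch.randn(128, 3, 32, 32)
labels = torch.randint(0, num_classes, (128,))

with torch.no_grad():
    # Per-image filter response: mean absolute activation over spatial dims.
    acts = conv(images).abs().mean(dim=(2, 3))            # shape (N, num_filters)

    # Class-wise score: average response of each filter over images of a class.
    class_scores = torch.zeros(num_classes, num_filters)
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            class_scores[c] = acts[mask].mean(dim=0)

    # Normalize per class, then average to get one importance value per filter.
    class_scores = class_scores / (class_scores.sum(dim=1, keepdim=True) + 1e-12)
    importance = class_scores.mean(dim=0)                  # shape (num_filters,)

# Prune the filters with the lowest class-wise importance (50% pruning here).
keep = importance.argsort(descending=True)[: num_filters // 2]
print("filters kept:", sorted(keep.tolist()))

In an actual pruning pipeline, the kept filter indices would be used to rebuild (resize) the layer and its downstream connections before fine-tuning, which is the "network resizing" step the abstract refers to.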
Pages: 13