Filter pruning by quantifying feature similarity and entropy of feature maps

Cited: 12
Authors
Liu, Yajun [1 ]
Fan, Kefeng [2 ]
Wu, Dakui [1 ]
Zhou, Wenju [1 ]
Affiliations
[1] Shanghai Univ, Sch Mechatron Engn & Automat, Shanghai 200444, Peoples R China
[2] China Elect Standardizat Inst, Beijing 100007, Peoples R China
Keywords
Filter pruning; Feature similarity (FSIM); Two-dimensional entropy (2D entropy); Feature maps; MODEL;
DOI
10.1016/j.neucom.2023.126297
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Filter pruning can effectively reduce the time cost and computing resources of convolutional neural networks (CNNs) and is well suited to lightweight edge devices. However, most current pruning methods focus on the inherent properties of the filters themselves and pay less attention to the connection between filters and their feature maps. Feature similarity (FSIM) exploits the fact that the human visual system is more sensitive to the low-level features of images to assess image quality more accurately. We find that FSIM is also suitable for evaluating the feature maps of CNNs. In addition, the information richness of a feature map reflects the importance of the filter that produces it. Based on these observations, we propose to quantify the importance of feature maps with FSIM and a two-dimensional entropy (2D entropy) indicator, and to use this score to guide filter pruning (FSIM-E). FSIM-E is evaluated on CIFAR-10 and ILSVRC-2012 and effectively compresses and accelerates the network model. For example, for ResNet-110 on CIFAR-10, FSIM-E prunes 71.1% of the FLOPs and 66.5% of the parameters while improving accuracy by 0.1%. With ResNet-50 on ILSVRC-2012, FSIM-E achieves a 57.2% pruning rate of FLOPs and a 53.1% pruning rate of parameters with a loss of only 0.42% in Top-5 accuracy. (c) 2023 Elsevier B.V. All rights reserved.
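The scoring idea the abstract describes can be illustrated with a short sketch. The block below is a minimal, hypothetical NumPy illustration, not the authors' implementation: the paper's FSIM term (phase congruency plus gradient magnitude) is simplified here to a gradient-magnitude similarity against the layer's mean feature map, and the 2D entropy term uses the classic (intensity, neighbourhood mean) joint histogram. The function names, the choice of reference map, and the weighting factor alpha are all illustrative assumptions.

```python
# Minimal sketch (illustrative only): score each channel's feature map by a
# simplified similarity term plus its 2D entropy, then prune low scores.
import numpy as np

def two_d_entropy(fm, bins=64):
    """Classic 2D image entropy: joint histogram of (pixel intensity,
    3x3 neighbourhood mean), H = -sum p * log2(p)."""
    fm = (fm - fm.min()) / (np.ptp(fm) + 1e-12)         # normalise to [0, 1]
    pad = np.pad(fm, 1, mode="edge")                    # 3x3 box mean, no SciPy
    h, w = fm.shape
    mean = sum(pad[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    hist, _, _ = np.histogram2d(fm.ravel(), mean.ravel(),
                                bins=bins, range=[[0, 1], [0, 1]])
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def grad_similarity(fm, ref):
    """Gradient-magnitude similarity of a feature map against a reference
    map -- a simplified stand-in for the FSIM term in the abstract."""
    def gmag(x):
        gy, gx = np.gradient(x)
        return np.hypot(gx, gy)
    g1, g2 = gmag(fm), gmag(ref)
    c = 1e-6                                            # numerical stability
    return float(((2.0 * g1 * g2 + c) / (g1 ** 2 + g2 ** 2 + c)).mean())

def score_filters(fmaps, alpha=0.5):
    """fmaps: (C, H, W) feature maps of one layer, already averaged over a
    calibration batch.  Low scores mark candidate filters for pruning."""
    ref = fmaps.mean(axis=0)                            # layer-wise mean map
    sim = np.array([grad_similarity(f, ref) for f in fmaps])
    ent = np.array([two_d_entropy(f) for f in fmaps])
    ent = ent / (ent.max() + 1e-12)                     # put terms on one scale
    return alpha * sim + (1.0 - alpha) * ent

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fmaps = rng.random((16, 28, 28))                    # 16 toy channels
    scores = score_filters(fmaps)
    keep = np.argsort(scores)[len(scores) // 2:]        # prune lowest 50%
    print("channels kept:", sorted(keep.tolist()))
```

In this reading, a filter whose feature map is both information-rich (high 2D entropy) and perceptually consistent with the layer's output (high similarity) is kept, while low-scoring filters are pruned.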
Pages: 11
Related Papers
50 records in total
  • [41] Pruning Filter via Gaussian Distribution Feature for Deep Neural Networks Acceleration
    Xu, Jianrong
    Diao, Boyu
    Cui, Bifeng
    Yang, Kang
    Li, Chao
    Hong, Hailong
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [42] Filter pruning-based two-step feature map reconstruction
    Yongsheng Liang
    Wei Liu
    Shuangyan Yi
    Huoxiang Yang
    Zhenyu He
    Signal, Image and Video Processing, 2021, 15 : 1555 - 1563
  • [43] Filter feature
    Bigham, Roy
    Pollution Engineering, 2013, 45 (08) : 21 - 23
  • [44] Unified Entropy in Self-organizing Feature Maps Neural Network
    Zhu, Chunyang
    PROCEEDINGS OF THE 2017 2ND INTERNATIONAL SYMPOSIUM ON ADVANCES IN ELECTRICAL, ELECTRONICS AND COMPUTER ENGINEERING (ISAEECE 2017), 2017, 124 : 14 - 22
  • [45] AKECP: Adaptive Knowledge Extraction from Feature Maps for Fast and Efficient Channel Pruning
    Zhang, Haonan
    Liu, Longjun
    Zhou, Hengyi
    Hou, Wenxuan
    Sun, Hongbin
    Zheng, Nanning
    PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2021, 2021, : 648 - 657
  • [46] A novel measure for quantifying the topology preservation of self-organizing feature maps
    Su, MC
    Chang, HT
    Chou, CH
    NEURAL PROCESSING LETTERS, 2002, 15 (02) : 137 - 145
  • [47] PROTEINOTOPIC FEATURE MAPS
    MERELO, JJ
    ANDRADE, MA
    PRIETO, A
    MORAN, F
    NEUROCOMPUTING, 1994, 6 (04) : 443 - 454
  • [48] A Novel Measure for Quantifying the Topology Preservation of Self-Organizing Feature Maps
    Mu-Chun Su
    Hsiao-Te Chang
    Chien-Hsing Chou
    Neural Processing Letters, 2002, 15 : 137 - 145
  • [49] Attention Module Based on Feature Similarity and Feature Normalization
    Du, Qiliang
    Wang, Yimin
    Tian, Lianfang
    Huanan Ligong Daxue Xuebao/Journal of South China University of Technology (Natural Science), 2024, 52 (07): : 62 - 71
  • [50] Averaging feature maps
    Lewis, T
    Owens, R
    Baddeley, A
    PATTERN RECOGNITION, 1999, 32 (09) : 1615 - 1630