Towards efficient filter pruning via topology

Cited by: 0
Authors
Xiaozhou Xu
Jun Chen
Hongye Su
Lei Xie
Affiliations
[1] Zhejiang University,State Key Laboratory of Industrial Control Technology, College of Control Science and Engineering
Keywords
Model compression; Filter pruning; Neural networks; Image classification
DOI: not available
Abstract
With the rapid development of deep neural networks, compressing and accelerating them without performance deterioration has become a research hotspot. Among network compression methods, pruning is one of the most effective and popular. Inspired by several property-based pruning methods and by geometric topology, we focus our pruning method on extracting information from feature maps. We define a metric, called TopologyHole, that describes a feature map and is associated with the importance of the corresponding filter. In exploratory experiments, we find that the average TopologyHole of the feature maps produced by a given filter is relatively stable, regardless of how many image batches the CNN receives. This indicates that TopologyHole is a data-independent metric and a valid criterion for filter pruning. Extensive experiments demonstrate that preferentially pruning filters whose feature maps have high TopologyHole values achieves performance competitive with the state of the art.
Notably, on ImageNet, TopologyHole reduces FLOPs by 45.0% and removes 40.9% of the parameters of ResNet-50 while reaching 75.71% top-1 accuracy, a loss of only 0.44%.
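The pruning pipeline the abstract describes — score each filter by a statistic of its feature maps, then remove the highest-scoring filters first — can be sketched as below. The `near_zero_fraction` scoring function is a hypothetical stand-in for illustration only: the paper's actual TopologyHole metric is topological and is not specified here, so this sketch only shows the rank-and-prune mechanics, not the real criterion.

```python
import numpy as np

def near_zero_fraction(feature_maps, threshold=0.05):
    """Hypothetical proxy score, one value per filter.

    feature_maps: array of shape (batch, n_filters, H, W).
    Returns the fraction of near-zero activations per filter,
    averaged over the batch and spatial dimensions. (The paper's
    TopologyHole metric is different; this is illustrative only.)
    """
    near_zero = np.abs(feature_maps) < threshold
    return near_zero.mean(axis=(0, 2, 3))

def filters_to_prune(scores, prune_ratio=0.25):
    """Indices of the highest-scoring filters, pruned first."""
    k = int(len(scores) * prune_ratio)
    return np.argsort(scores)[::-1][:k]

# Toy batch of feature maps: 8 images, 16 filters, 4x4 maps.
rng = np.random.default_rng(0)
fmaps = rng.normal(size=(8, 16, 4, 4))
fmaps[:, :4] *= 0.01  # make the first four filters nearly "dead"

scores = near_zero_fraction(fmaps)
pruned = filters_to_prune(scores, prune_ratio=0.25)  # 4 of 16 filters
```

With these toy inputs the four near-dead filters get the highest scores and are selected for removal; in a real setting the surviving filters' weights would then be copied into a smaller network and fine-tuned.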
Pages: 639-649 (10 pages)