Filter pruning-based two-step feature map reconstruction

Cited: 0
Authors
Yongsheng Liang
Wei Liu
Shuangyan Yi
Huoxiang Yang
Zhenyu He
Affiliations
[1] Harbin Institute of Technology,School of Computer Science and Technology
[2] Peng Cheng Laboratory,Research Center of Networks and Communications
[3] Shenzhen Institute of Information Technology,School of Software Engineering
[4] Shenzhen University,School of Electronics and Information Engineering
Keywords
Filter pruning; Channel pruning; Feature map reconstruction; ℓ2,1-norm
DOI: none available
Abstract
In deep neural network compression, channel/filter pruning is widely used to compress a pre-trained network by identifying redundant channels/filters. In this paper, we propose a two-step filter pruning method that identifies redundant channels/filters layer by layer. The first step designs a filter selection scheme based on the ℓ2,1-norm by reconstructing the feature map of the current layer. More specifically, the filter selection scheme solves a joint ℓ2,1-norm minimization problem, i.e., both the regularization term and the feature map reconstruction error term are constrained by the ℓ2,1-norm. The ℓ2,1-norm regularization drives the channel/filter selection, while the ℓ2,1-norm on the feature map reconstruction error yields robust reconstruction. In this way, the proposed filter selection scheme learns a column-sparse coefficient representation matrix that indicates which filters are redundant. Since pruning the redundant filters in the current layer might dramatically change the output feature map of the following layer, the second step updates the filters of the following layer so that its output feature map approximates that of the baseline network. Experimental results demonstrate the effectiveness of the proposed method. For example, our pruned VGG-16 achieves a 4× speedup on ImageNet with a 0.95% top-5 accuracy drop; our pruned ResNet-50 achieves a 2× speedup on ImageNet with a 1.56% top-5 accuracy drop; and our pruned MobileNet achieves a 2× speedup on ImageNet with a 1.20% top-5 accuracy drop.
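The selection step described above can be illustrated with a minimal sketch. This is not the authors' implementation: for simplicity it replaces the ℓ2,1-norm reconstruction error with a squared-Frobenius fit term (so the smooth part has a plain gradient) and keeps only the ℓ2,1 regularizer, solved by proximal gradient descent with group soft-thresholding of rows. Rows of the coefficient matrix `A` whose ℓ2 norm is driven to (near) zero mark redundant channels; all names and parameters are illustrative.

```python
import numpy as np

def row_shrink(M, t):
    # Proximal operator of t * ||.||_{2,1}: soft-threshold each row's l2 norm.
    norms = np.linalg.norm(M, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - t / np.maximum(norms, 1e-12))
    return M * scale

def select_filters(X, lam=1.0, iters=500):
    """Toy channel selection: minimize 0.5*||X - X A||_F^2 + lam*||A||_{2,1}
    over A by proximal gradient descent. X is (samples x channels); the
    returned row norms of A score each channel's importance."""
    n, c = X.shape
    A = np.eye(c)
    lr = 1.0 / (np.linalg.norm(X, 2) ** 2 + 1e-12)  # 1/L step size
    for _ in range(iters):
        grad = X.T @ (X @ A - X)          # gradient of the smooth fit term
        A = row_shrink(A - lr * grad, lr * lam)
    return np.linalg.norm(A, axis=1)      # near-zero rows => prunable channels
```

A channel that contributes nothing to the reconstruction (e.g. an all-zero feature column) receives no gradient from the fit term, so the regularizer shrinks its row of `A` exactly to zero; informative channels keep nonzero scores.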
Pages: 1555-1563 (8 pages)