Structured pruning via feature channels similarity and mutual learning for convolutional neural network compression

Cited by: 0
Authors
Wei Yang
Yancai Xiao
Affiliations
[1] School of Mechanical, Electronic and Control Engineering, Beijing Jiaotong University
[2] Key Laboratory of Vehicle Advanced Manufacturing, Measuring and Control Technology, Ministry of Education, Beijing Jiaotong University
Source
Applied Intelligence | 2022, Vol. 52
Keywords
Convolutional neural network; Model compression; Feature channels similarity; Mutual learning
DOI
Not available
Abstract
The deployment of convolutional neural networks (CNNs) on resource-constrained devices has been hindered by their large memory footprint and computational cost. To obtain a lightweight network, we propose the feature channels similarity and mutual learning fine-tuning (FCS-MLFT) method. First, we target the similarity redundancy among the output feature channels of a CNN and propose a novel structured pruning criterion based on cosine similarity: K-Means is used to cluster the convolution kernels, according to the L1 norms of their feature maps, into several bins, and similarity values between feature channels are then computed within each bin. Second, instead of the traditional approach of reusing the training strategy to recover the accuracy of the compressed model, we apply mutual learning fine-tuning (MLFT) to improve the accuracy of the compact model; the proposed method matches the accuracy of traditional fine-tuning (TFT) while requiring significantly fewer epochs. Experimental results show not only that the FCS criterion outperforms existing criteria, such as kernel norm-based and layer-wise feature norm-based methods, but also that the MLFT strategy reduces the number of fine-tuning epochs.
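The pruning criterion described above can be sketched in NumPy. This is an illustrative reconstruction from the abstract alone, not the authors' implementation: channels are binned with a minimal 1-D K-Means on the L1 norms of their feature maps, cosine similarity is computed between channels within each bin, and the channels with the highest average similarity to their bin peers are marked as redundant. The function names, the bin count `k`, and the `prune_ratio` are assumptions for the sketch.

```python
import numpy as np

def kmeans_1d(values, k, iters=20, seed=0):
    """Minimal 1-D K-Means: returns a bin label for each value."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(values, size=k, replace=False).astype(float)
    labels = np.zeros(len(values), dtype=int)
    for _ in range(iters):
        # assign each value to its nearest center
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels

def fcs_prune_mask(feature_maps, k=3, prune_ratio=0.3):
    """Sketch of the FCS criterion. feature_maps: (C, H, W) output
    channels of one conv layer. Returns a boolean mask, True = keep."""
    C = feature_maps.shape[0]
    flat = feature_maps.reshape(C, -1)
    l1 = np.abs(flat).sum(axis=1)          # L1 norm of each feature map
    bins = kmeans_1d(l1, k)                # cluster channels into k bins
    # cosine similarity between channels, computed only within each bin
    normed = flat / (np.linalg.norm(flat, axis=1, keepdims=True) + 1e-12)
    redundancy = np.zeros(C)
    for j in range(k):
        idx = np.where(bins == j)[0]
        if len(idx) < 2:
            continue
        sim = normed[idx] @ normed[idx].T  # pairwise cosine similarities
        np.fill_diagonal(sim, 0.0)
        redundancy[idx] = sim.mean(axis=1) # avg similarity to bin peers
    n_prune = int(prune_ratio * C)
    prune_idx = np.argsort(redundancy)[-n_prune:]  # most redundant channels
    mask = np.ones(C, dtype=bool)
    mask[prune_idx] = False
    return mask
```

Binning by L1 norm first means similarity is only compared between channels of comparable magnitude, so a weak channel is not pruned merely for pointing in the same direction as a strong one.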
Pages: 14560-14570 (10 pages)