Neural network pruning based on channel attention mechanism

Cited by: 5
Authors
Hu, Jianqiang [1 ]
Liu, Yang [1 ]
Wu, Keshou [1 ]
Affiliations
[1] Xiamen Univ Technol, Sch Comp & Informat Engn, Xiamen 361024, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Convolutional neural networks; channel attention mechanism; Leaky-SE; filter pruning; EFFICIENT;
DOI
10.1080/09540091.2022.2111405
CLC classification number
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Network pruning facilitates the deployment of convolutional neural networks in resource-limited environments by reducing redundant parameters. However, most existing methods ignore the differences in the contributions of the output feature maps. In response, we propose a novel neural network pruning method based on the channel attention mechanism. We first utilise principal component analysis (PCA) to reduce the influence of noisy data on the feature maps. We then propose an improved Leaky-Squeeze-and-Excitation (Leaky-SE) block that uses the channel attention mechanism to evaluate the contribution of each output feature map. Finally, we remove low-contribution channels while minimising the loss in model performance. Extensive experimental results show that our method achieves significant improvements over the state of the art in FLOPs and parameter reduction at similar accuracy. For example, with the VGG-16 baseline, our method reduces parameters by 83.3% and FLOPs by 66.3%, with only a 0.13% loss in top-5 accuracy. Furthermore, it effectively balances pruning efficiency and prediction accuracy.
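The pruning pipeline the abstract describes, an SE-style squeeze-and-excitation scoring step followed by removal of the lowest-scoring channels, can be sketched as below. This is a minimal illustration, not the authors' implementation: the function names, the Leaky ReLU slope, the bottleneck weights, and the keep ratio are all assumptions for the sake of a runnable example, and the PCA denoising step is omitted.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU used inside the excitation bottleneck (slope is illustrative)
    return np.where(x > 0, x, alpha * x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(feature_maps, w1, w2, alpha=0.01):
    """Leaky-SE-style attention score per channel.

    feature_maps: (C, H, W) activations of one layer
    w1: (C // r, C) squeeze weights, w2: (C, C // r) excitation weights
    Returns a (C,) vector of scores in (0, 1).
    """
    z = feature_maps.mean(axis=(1, 2))            # squeeze: global average pool -> (C,)
    s = sigmoid(w2 @ leaky_relu(w1 @ z, alpha))   # excitation: bottleneck + gate -> (C,)
    return s

def prune_channels(scores, keep_ratio):
    """Return the (sorted) indices of the channels to keep,
    i.e. the highest-attention channels; the rest are pruned."""
    k = max(1, int(round(keep_ratio * scores.size)))
    return np.sort(np.argsort(scores)[::-1][:k])

# Hypothetical usage: score 8 channels with random bottleneck weights (r = 2),
# then keep the top half.
rng = np.random.default_rng(0)
C, r = 8, 2
w1 = rng.standard_normal((C // r, C))
w2 = rng.standard_normal((C, C // r))
fmaps = rng.standard_normal((C, 4, 4))
scores = channel_attention(fmaps, w1, w2)
kept = prune_channels(scores, keep_ratio=0.5)
```

In a real pruning pass the kept indices would be used to slice the convolution's weight tensor (and the corresponding input channels of the next layer) before fine-tuning.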
Pages: 2201-2218
Page count: 18
Related papers
50 records
  • [1] Differentiable channel pruning guided via attention mechanism: a novel neural network pruning approach
    Cheng, Hanjing
    Wang, Zidong
    Ma, Lifeng
    Wei, Zhihui
    Alsaadi, Fawaz E.
    Liu, Xiaohui
    [J]. COMPLEX & INTELLIGENT SYSTEMS, 2023, 9 (05) : 5611 - 5624
  • [2] Channel pruning based on convolutional neural network sensitivity
    Yang, Chenbin
    Liu, Huiyi
    [J]. NEUROCOMPUTING, 2022, 507 : 97 - 106
  • [3] Convolutional Neural Network Channel Pruning Based on Regularized Sparse
    Bao, Chun
    Yu, Chongchong
    Xie, Tao
    Hu, Xinyu
    [J]. 2019 IEEE 4TH INTERNATIONAL CONFERENCE ON SIGNAL AND IMAGE PROCESSING (ICSIP 2019), 2019, : 679 - 684
  • [4] Towards performance-maximizing neural network pruning via global channel attention
    Wang, Yingchun
    Guo, Song
    Guo, Jingcai
    Zhang, Jie
    Zhang, Weizhan
    Yan, Caixia
    Zhang, Yuanhong
    [J]. NEURAL NETWORKS, 2024, 171 : 104 - 113
  • [5] Lossless Reconstruction of Convolutional Neural Network for Channel-Based Network Pruning
    Lee, Donghyeon
    Lee, Eunho
    Hwang, Youngbae
    [J]. SENSORS, 2023, 23 (04)
  • [6] A framework for deep neural network multiuser authorization based on channel pruning
    Wang, Linna
    Song, Yunfei
    Zhu, Yujia
    Xia, Daoxun
    Han, Guoquan
    [J]. CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2023, 35 (21)
  • [7] SIECP: Neural Network Channel Pruning based on Sequential Interval Estimation
    Chen, Si-Bao
    Zheng, Yu-Jie
    Ding, Chris H. Q.
    Luo, Bin
    [J]. NEUROCOMPUTING, 2022, 481 : 1 - 10
  • [8] Action Detection Based on 3D Convolution Neural Network with Channel Attention Mechanism
    Gao, Yan
    Liang, Huilai
    Liu, Baodi
    Wang, Yanjiang
    [J]. 2020 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (SSCI), 2020, : 602 - 606
  • [9] A Neural Network Based Text Classification with Attention Mechanism
    Lu, SiChen
    [J]. PROCEEDINGS OF 2019 IEEE 7TH INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE AND NETWORK TECHNOLOGY (ICCSNT 2019), 2019, : 333 - 338