Channel pruning guided by spatial and channel attention for DNNs in intelligent edge computing

Cited by: 15
Authors
Liu, Mengran [1 ]
Fang, Weiwei [1 ,2 ]
Ma, Xiaodong [1 ]
Xu, Wenyuan [1 ]
Xiong, Naixue [3 ]
Ding, Yi [4 ]
Affiliations
[1] Beijing Jiaotong Univ, Sch Comp & Informat Technol, Beijing 100044, Peoples R China
[2] Minist Educ, Key Lab Ind Internet Things & Networked Control, Chongqing 400065, Peoples R China
[3] Tianjin Univ, Coll Intelligence & Comp, Tianjin 300350, Peoples R China
[4] Beijing Wuzi Univ, Sch Informat, Beijing 101149, Peoples R China
Keywords
Deep neural networks; Model compression; Channel pruning; Attention module; Neural networks
DOI
10.1016/j.asoc.2021.107636
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification
081104; 0812; 0835; 1405
Abstract
Deep Neural Networks (DNNs) have recently achieved remarkable success in many computer vision tasks, but their huge number of parameters and high computation overhead hinder deployment on resource-constrained edge devices. Channel pruning is an effective approach for compressing DNN models. A critical challenge is to determine which channels to remove so that model accuracy is not negatively affected. In this paper, we first propose Spatial and Channel Attention (SCA), a new attention module combining both spatial and channel attention, which respectively focus on "where" and "what" are the most informative parts. Guided by the scale values generated by SCA for measuring channel importance, we further propose a new channel pruning approach called Channel Pruning guided by Spatial and Channel Attention (CPSCA). Experimental results indicate that SCA achieves the best inference accuracy while incurring negligible extra resource consumption, compared to other state-of-the-art attention modules. Our evaluation on two benchmark datasets shows that, with the guidance of SCA, our CPSCA approach achieves higher inference accuracy than other state-of-the-art pruning methods under the same pruning ratios. (C) 2021 Elsevier B.V. All rights reserved.
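For intuition only, the following is a minimal sketch (not the authors' released code) of how a combined spatial-and-channel attention block could produce per-channel scale values, and how those values could then rank channels for pruning. It assumes a CBAM-style design in PyTorch; all names (SCABlock, channels_to_prune, prune_ratio) are illustrative and not taken from the paper.

    import torch
    import torch.nn as nn

    class SCABlock(nn.Module):
        # Channel attention ("what") followed by spatial attention ("where").
        def __init__(self, channels: int, reduction: int = 16, kernel_size: int = 7):
            super().__init__()
            # Shared MLP applied to globally avg- and max-pooled channel descriptors.
            self.mlp = nn.Sequential(
                nn.Linear(channels, channels // reduction),
                nn.ReLU(inplace=True),
                nn.Linear(channels // reduction, channels),
            )
            # Spatial attention: 2-channel (avg, max) map -> 1-channel spatial mask.
            self.spatial = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

        def forward(self, x):
            b, c = x.shape[:2]
            # Per-channel scale values in [0, 1]; these double as importance scores.
            avg = self.mlp(x.mean(dim=(2, 3)))
            mx = self.mlp(x.amax(dim=(2, 3)))
            channel_scale = torch.sigmoid(avg + mx)
            x = x * channel_scale.view(b, c, 1, 1)
            # Per-pixel scale values highlighting informative spatial locations.
            s = torch.cat([x.mean(dim=1, keepdim=True),
                           x.amax(dim=1, keepdim=True)], dim=1)
            x = x * torch.sigmoid(self.spatial(s))
            return x, channel_scale

    def channels_to_prune(scale_history: torch.Tensor, prune_ratio: float):
        # scale_history: (N, C) channel scales collected over a calibration set.
        importance = scale_history.mean(dim=0)        # average scale per channel
        k = int(prune_ratio * importance.numel())
        return torch.argsort(importance)[:k]          # indices of least important channels

    # Example usage on random data:
    block = SCABlock(channels=64)
    feat, scales = block(torch.randn(8, 64, 32, 32))
    prune_idx = channels_to_prune(scales.detach(), prune_ratio=0.3)

Under this kind of scheme, channels whose average attention scale stays near zero over a calibration set contribute little to the scaled feature maps and are natural pruning candidates.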
Pages: 10
Related Papers (50 records in total)
  • [11] Neural network pruning based on channel attention mechanism
    Hu, Jianqiang
    Liu, Yang
    Wu, Keshou
    CONNECTION SCIENCE, 2022, 34 (01) : 2201 - 2218
  • [12] CHANNEL PRUNING VIA ATTENTION MODULE AND MEMORY CURVE
    Li, Hufei
    Cao, Jian
    Liu, Xiangcheng
    Chen, Jue
    Shang, Jingjie
    Qian, Yu
    Wang, Yuan
    2023 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2023: 1985 - 1989
  • [13] Channel Pruning Guided by Classification Loss and Feature Importance
    Guo, Jinyang
    Ouyang, Wanli
    Xu, Dong
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 10885 - 10892
  • [14] Identification of plant leaf diseases by deep learning based on channel attention and channel pruning
    Chen, Riyao
    Qi, Haixia
    Liang, Yu
    Yang, Mingchao
    FRONTIERS IN PLANT SCIENCE, 2022, 13
  • [15] Lightweight Vehicle Point Cloud Completion Network Combined with Channel Pruning and Channel Attention
    Yang, Xiaowen
    Feng, Bodong
    Han, Huiyan
    Kuang, Liqun
    Han, Xie
    He, Ligang
    Computer Engineering and Applications, 2025, 61 (01) : 232 - 242
  • [16] Intelligent Channel Estimation Based on Edge Computing for C-V2I
    Liao Y.
    Tian X.-Y.
    Cai Z.-R.
    Hua Y.-X.
    Han Q.-W.
    Tien Tzu Hsueh Pao/Acta Electronica Sinica, 2021, 49 (05): 833 - 842
  • [17] Pruning Attention in Transformers for Nonlinear Channel Compensation in Optical Systems
    Hamgini, Behnam Behinaein
    Najafi, Hossein
    Bakhshali, Ali
    Zhang, Zhuhong
    2024 OPTICAL FIBER COMMUNICATIONS CONFERENCE AND EXHIBITION, OFC, 2024
  • [18] FPC: Feature Map Pruning using Channel Attention Mechanism
    Liu, Yang
    Hu, Jianqiang
    Zhou, Xiaobao
    Wu, Jiaxin
    INTERNATIONAL CONFERENCE ON INTELLIGENT TRAFFIC SYSTEMS AND SMART CITY (ITSSC 2021), 2022, 12165
  • [19] A Novel Channel Pruning Compression Algorithm Combined with an Attention Mechanism
    Zhao, Ming
    Luo, Tie
    Peng, Sheng-Lung
    Tan, Junbo
    ELECTRONICS, 2023, 12 (07)
  • [20] ARPruning: An automatic channel pruning based on attention map ranking
    Yuan, Tongtong
    Li, Zulin
    Liu, Bo
    Tang, Yinan
    Liu, Yujia
    NEURAL NETWORKS, 2024, 174