Dynamic channel pruning via activation gates

Cited by: 0
Authors
Shun-Qiang Liu
Yan-Xia Yang
Xue-Jin Gao
Kun Cheng
Affiliations
[1] Beijing University of Technology, Faculty of Life and Environment
[2] Beijing University of Technology, Faculty of Information Technology
[3] Ministry of Education, Engineering Research Center of Digital Community
Source
Applied Intelligence | 2022, Volume 52
Keywords
Channel pruning; Dynamic ReLU; Activation gates
DOI: not available
Abstract
Dynamic channel pruning has proved to be an effective way to reduce computational cost by dynamically adjusting the inference path. However, in most existing work the classification performance degrades rapidly as the pruning rate increases, because the pruning strategy weakens the representation ability of the model to some extent. To address this problem, a dynamic channel pruning method based on activation gates (DCPAG) is proposed, which better maintains classification performance while reducing computational cost. First, a pipeline for generating the pruning strategy, called the channel pruning auxiliary (CPA), is proposed, which accounts for both representation ability and computational cost. Second, the pruning strategy generated by CPA is embedded into the dynamic rectified linear unit (DyReLU) to form the embedded dynamic rectified linear unit (EB-DyReLU), which performs dynamic channel pruning while maintaining representation capability. Third, each input sample is classified by its recognition difficulty during pruning, and hard samples receive additional training to achieve better classification performance. Finally, experiments on CIFAR-10 and ImageNet verify the effectiveness of DCPAG in terms of accuracy and floating-point operations (FLOPs). The results show that the proposed method outperforms other similar channel-based methods at the same pruning rate: it not only improves classification accuracy by 0.5%-1.5%, but also reduces computational cost by 5%-20%.
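For intuition, the sketch below shows how a per-channel activation gate can be fused with a DyReLU-style activation, in the spirit of the EB-DyReLU described above. It is a minimal PyTorch sketch under assumed design choices (the GatedDyReLU name, the hyper-network layout, the 0.5 gate threshold, and the straight-through hard gate are all illustrative, not the authors' implementation): a pooled context vector predicts both the piecewise-linear activation coefficients and a keep-probability per channel, and gated-off channels are zeroed so a dynamic-pruning runtime could skip their downstream computation.

```python
# Minimal sketch of a gated dynamic ReLU (not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedDyReLU(nn.Module):
    """Per-channel activation gate fused with a DyReLU-B style activation.

    A small hyper-network driven by globally pooled features predicts
    (a) the piecewise-linear coefficients of the activation and
    (b) a keep-probability per channel. Channels whose gate falls below
    the threshold are zeroed here; a dynamic-pruning runtime could skip
    their computation entirely.
    """

    def __init__(self, channels: int, reduction: int = 4, k: int = 2,
                 gate_threshold: float = 0.5):
        super().__init__()
        self.k = k                      # number of linear pieces per channel
        self.gate_threshold = gate_threshold
        hidden = max(channels // reduction, 8)
        self.hyper = nn.Sequential(
            nn.Linear(channels, hidden),
            nn.ReLU(inplace=True),
            # 2*k coefficients (slopes, offsets) plus 1 gate logit per channel
            nn.Linear(hidden, channels * (2 * k + 1)),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n, c, _, _ = x.shape
        ctx = F.adaptive_avg_pool2d(x, 1).flatten(1)            # (N, C)
        params = self.hyper(ctx).view(n, c, 2 * self.k + 1)

        # DyReLU-B style coefficients, kept close to an identity/ReLU shape.
        a = 1.0 + 0.5 * torch.tanh(params[..., :self.k])        # slopes
        b = 0.5 * torch.tanh(params[..., self.k:2 * self.k])    # offsets
        gate_prob = torch.sigmoid(params[..., -1])              # (N, C)

        # Hard gate with a straight-through estimator so it stays trainable.
        hard = (gate_prob > self.gate_threshold).float()
        gate = hard + gate_prob - gate_prob.detach()

        # y = max_k (a_k * x + b_k), applied per sample and per channel.
        x_ = x.unsqueeze(-1)                                    # (N, C, H, W, 1)
        a_ = a.view(n, c, 1, 1, self.k)
        b_ = b.view(n, c, 1, 1, self.k)
        y = (a_ * x_ + b_).max(dim=-1).values                   # (N, C, H, W)

        return y * gate.view(n, c, 1, 1)


if __name__ == "__main__":
    act = GatedDyReLU(channels=64)
    feat = torch.randn(2, 64, 32, 32)
    out = act(feat)
    kept = (out.abs().sum(dim=(2, 3)) > 0).float().mean().item()
    print(out.shape, f"fraction of channels kept: {kept:.2f}")
```

In a full DCPAG-like setup, the gate logits would be trained under a budget term that trades accuracy against FLOPs; the sketch omits that loss and the hard-sample re-training step described in the abstract.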
Pages: 16818-16831
Number of pages: 13