CHANNEL PRUNING VIA GRADIENT OF MUTUAL INFORMATION FOR LIGHTWEIGHT CONVOLUTIONAL NEURAL NETWORKS

Cited by: 0
Authors
Lee, Min Kyu [1 ]
Lee, Seunghyun [1 ]
Lee, Sang Hyuk [1 ]
Song, Byung Cheol [1 ]
Affiliations
[1] Inha Univ, Dept Elect Engn, Incheon, South Korea
Keywords
convolutional neural network; pruning; model compression; mutual information;
DOI
Not available
Chinese Library Classification
TB8 [Photography Technology];
Discipline Classification Code
0804;
Abstract
Channel pruning is highly effective for making networks lightweight, reducing both memory footprint and computational cost. Many channel pruning methods assume that the magnitude of a particular element corresponding to each channel reflects that channel's importance. Unfortunately, this assumption does not always hold. To solve this problem, this paper proposes a new method that measures channel importance based on gradients of mutual information. The proposed method computes these gradients during back-propagation by attaching a module capable of estimating mutual information, and uses the measured statistics as channel importance so that less important channels can be removed. Finally, fine-tuning robustly restores the performance of the pruned model. Experimental results show that the proposed method achieves better performance with fewer parameters and FLOPs than conventional schemes.
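The pipeline described above (score channels by gradient statistics of a mutual-information estimate, then keep only the top-ranked channels) can be sketched as follows. This is a minimal NumPy illustration under assumptions of ours, not the paper's implementation: the mutual-information estimator itself is omitted, the gradient tensor shape `(batch, channels, H, W)` is hypothetical, and per-channel importance is taken here as the mean absolute gradient, which the abstract only describes loosely as "measured statistics".

```python
import numpy as np

def channel_importance(mi_grads):
    # Hypothetical importance score: mean absolute gradient of the
    # MI-estimator output w.r.t. each channel's activations, averaged
    # over batch and spatial dimensions. mi_grads: (batch, C, H, W).
    return np.abs(mi_grads).mean(axis=(0, 2, 3))

def prune_channels(weights, mi_grads, keep_ratio=0.5):
    # Rank output channels of a conv weight tensor (out_ch, in_ch, kH, kW)
    # by importance and keep the top keep_ratio fraction.
    scores = channel_importance(mi_grads)
    n_keep = max(1, int(round(keep_ratio * len(scores))))
    keep = np.sort(np.argsort(scores)[::-1][:n_keep])  # kept indices, in order
    return weights[keep], keep

# Toy example with random weights and fake MI gradients.
rng = np.random.default_rng(0)
w = rng.standard_normal((8, 3, 3, 3))    # conv layer with 8 output channels
g = rng.standard_normal((4, 8, 16, 16))  # stand-in for MI-estimator gradients
pruned, kept = prune_channels(w, g, keep_ratio=0.5)
print(pruned.shape)  # (4, 3, 3, 3)
```

After pruning, the paper fine-tunes the smaller model to recover accuracy; that step is ordinary training and is not shown here.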
Pages: 1751 - 1755 (5 pages)