CHANNEL PRUNING VIA GRADIENT OF MUTUAL INFORMATION FOR LIGHTWEIGHT CONVOLUTIONAL NEURAL NETWORKS

Cited by: 0
Authors
Lee, Min Kyu [1 ]
Lee, Seunghyun [1 ]
Lee, Sang Hyuk [1 ]
Song, Byung Cheol [1 ]
Affiliations
[1] Inha Univ, Dept Elect Engn, Incheon, South Korea
Keywords
convolutional neural network; pruning; model compression; mutual information
DOI
Not available
Chinese Library Classification (CLC)
TB8 [Photographic Technology]
Discipline Classification Code
0804
Abstract
Channel pruning for lightweight networks is highly effective in reducing memory footprint and computational cost. Many channel pruning methods assume that the magnitude of a particular element corresponding to each channel reflects the importance of that channel. Unfortunately, this assumption does not always hold. To solve this problem, this paper proposes a new method that measures channel importance from the gradients of mutual information. The proposed method attaches a module capable of estimating mutual information to the network and measures the gradients of mutual information during back-propagation. Using the measured statistics as channel importance, less important channels are removed. Finally, fine-tuning robustly restores the performance of the pruned model. Experimental results show that the proposed method achieves better performance with smaller parameter sizes and fewer FLOPs than conventional schemes.
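The abstract describes the pipeline only at a high level. Below is a minimal PyTorch-style sketch of the general idea, scoring channels by the gradient of a mutual-information estimate obtained during back-propagation; the MINE-style critic, the function names, and the 25% pruning ratio are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MIEstimator(nn.Module):
    # Toy MINE-style critic scoring (feature, label) pairs; a stand-in
    # (assumption) for the paper's mutual-information estimation module.
    def __init__(self, channels, num_classes):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(channels + num_classes, 128),
            nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, feat, onehot):
        return self.net(torch.cat([feat, onehot], dim=1))

def channel_scores(conv_out, labels, num_classes, estimator):
    # Score each channel by |d(MI lower bound) / d(channel activation)|.
    feat = conv_out.mean(dim=(2, 3))          # global average pool -> (N, C)
    feat.retain_grad()                        # keep gradients on this node
    onehot = F.one_hot(labels, num_classes).float()
    joint = estimator(feat, onehot)           # paired (joint) samples
    shuffled = onehot[torch.randperm(len(labels))]
    marginal = estimator(feat, shuffled)      # shuffled (product) samples
    # Donsker-Varadhan lower bound on mutual information
    mi_lb = joint.mean() - torch.log(marginal.exp().mean() + 1e-8)
    mi_lb.backward()                          # back-propagate the MI estimate
    return feat.grad.abs().mean(dim=0)        # per-channel statistic, shape (C,)

# Usage: score once, keep the highest-scoring 75% of channels (ratio assumed).
conv = nn.Conv2d(3, 16, 3, padding=1)
estimator = MIEstimator(channels=16, num_classes=10)
x = torch.randn(8, 3, 32, 32)
y = torch.randint(0, 10, (8,))
scores = channel_scores(conv(x), y, num_classes=10, estimator=estimator)
keep = scores.argsort(descending=True)[: int(16 * 0.75)]
print("channels to keep:", keep.tolist())

In a full pipeline, as the abstract describes, these statistics would be accumulated over many mini-batches, the lowest-scoring channels removed from the convolution, and the pruned network fine-tuned to restore accuracy.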
Pages: 1751-1755 (5 pages)