CHANNEL PRUNING VIA GRADIENT OF MUTUAL INFORMATION FOR LIGHTWEIGHT CONVOLUTIONAL NEURAL NETWORKS

Cited by: 0
Authors
Lee, Min Kyu [1 ]
Lee, Seunghyun [1 ]
Lee, Sang Hyuk [1 ]
Song, Byung Cheol [1 ]
Affiliations
[1] Inha Univ, Dept Elect Engn, Incheon, South Korea
Keywords
convolutional neural network; pruning; model compression; mutual information;
DOI: Not available
Chinese Library Classification (CLC): TB8 [Photographic Technology]
Discipline code: 0804
Abstract
Channel pruning for lightweight networks is very effective in reducing memory footprint and computational cost. Many channel pruning methods assume that the magnitude of a particular element corresponding to each channel reflects the importance of that channel. Unfortunately, this assumption does not always hold. To solve this problem, this paper proposes a new method that measures channel importance based on gradients of mutual information. The proposed method measures gradients of mutual information during back-propagation by attaching a module capable of estimating mutual information. Using the measured statistics as channel importance, less important channels can be removed. Finally, fine-tuning enables robust performance restoration of the pruned model. Experimental results show that the proposed method achieves better performance with fewer parameters and FLOPs than conventional schemes.
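A minimal PyTorch sketch of the idea described in the abstract, under stated assumptions: a small MINE-style critic stands in for the unspecified mutual-information-estimating module, the importance score of each channel is taken as the mean absolute gradient of the estimated MI with respect to that channel's activations, and the layer, pooling, and keep-ratio are illustrative choices, not the authors' implementation.

```python
# Sketch only: channel importance from the gradient of an estimated mutual information term.
# The critic architecture, pooling, and keep-ratio below are assumptions for illustration.
import torch
import torch.nn as nn


class MIEstimator(nn.Module):
    """Tiny MINE-style critic scoring (feature, label) pairs; purely illustrative."""

    def __init__(self, feat_dim, num_classes):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim + num_classes, 128), nn.ReLU(), nn.Linear(128, 1)
        )

    def forward(self, feats, labels_onehot):
        joint = self.net(torch.cat([feats, labels_onehot], dim=1))       # joint samples
        shuffled = labels_onehot[torch.randperm(labels_onehot.size(0))]  # marginal samples
        marginal = self.net(torch.cat([feats, shuffled], dim=1))
        # Donsker-Varadhan lower bound on I(features; labels)
        return joint.mean() - torch.log(torch.exp(marginal).mean() + 1e-8)


def channel_importance(feature_map, mi_lower_bound):
    """Score each channel by the mean |d MI / d activation| over batch and spatial dims."""
    grads = torch.autograd.grad(mi_lower_bound, feature_map, retain_graph=True)[0]
    return grads.abs().mean(dim=(0, 2, 3))  # one score per channel


# Toy usage: a single conv layer, random data, keep the highest-scoring channels.
torch.manual_seed(0)
x = torch.randn(16, 3, 32, 32)
labels = torch.nn.functional.one_hot(torch.randint(0, 10, (16,)), 10).float()

conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
feat = conv(x)
pooled = feat.mean(dim=(2, 3))                 # (batch, channels) summary for the critic
estimator = MIEstimator(feat_dim=8, num_classes=10)
mi = estimator(pooled, labels)

scores = channel_importance(feat, mi)
keep = scores.argsort(descending=True)[:6]     # e.g. keep the top 6 of 8 channels
print("channel scores:", scores.tolist())
print("kept channels:", keep.tolist())
```

In the full pipeline described in the abstract, such scores would be computed for every prunable layer, the lowest-scoring channels (and the corresponding filters in adjacent layers) removed, and the pruned network fine-tuned to recover accuracy.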
Pages: 1751-1755
Page count: 5
Related papers
50 records in total
  • [31] Filter pruning by image channel reduction in pre-trained convolutional neural networks
    Chung, Gi Su
    Won, Chee Sun
    MULTIMEDIA TOOLS AND APPLICATIONS, 2021, 80 (20) : 30817 - 30826
  • [32] GATE TRIMMING: ONE-SHOT CHANNEL PRUNING FOR EFFICIENT CONVOLUTIONAL NEURAL NETWORKS
    Yu, Fang
    Han, Chuanqi
    Wang, Pengcheng
    Huang, Xi
    Cui, Li
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 1365 - 1369
  • [34] Channel pruning based on convolutional neural network sensitivity
    Yang, Chenbin
    Liu, Huiyi
    NEUROCOMPUTING, 2022, 507 : 97 - 106
  • [35] Exploring adversarial examples and adversarial robustness of convolutional neural networks by mutual information
    Zhang J.
    Qian W.
    Cao J.
    Xu D.
    Neural Computing and Applications, 2024, 36 (23) : 14379 - 14394
  • [36] Incremental Filter Pruning via Random Walk for Accelerating Deep Convolutional Neural Networks
    Li, Qinghua
    Li, Cuiping
    Chen, Hong
    PROCEEDINGS OF THE 13TH INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING (WSDM '20), 2020, : 358 - 366
  • [37] LightweightNet: Toward fast and lightweight convolutional neural networks via architecture distillation
    Xu, Ting-Bing
    Yang, Peipei
    Zhang, Xu-Yao
    Liu, Cheng-Lin
    PATTERN RECOGNITION, 2019, 88 : 272 - 284
  • [38] STRUCTURED PRUNING FOR GROUP REGULARIZED CONVOLUTIONAL NEURAL NETWORKS VIA DYNAMIC REGULARIZATION FACTOR
    Li, Feng
    Li, Bo
    Zhu, Meijiao
    Ma, Junchi
    Yuan, Jinlong
    Journal of Industrial and Management Optimization, 2025, 21 (02) : 1440 - 1455
  • [39] Deep Learning for Channel Coding via Neural Mutual Information Estimation
    Fritschek, Rick
    Schaefer, Rafael F.
    Wunder, Gerhard
2019 IEEE 20TH INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS (SPAWC 2019), 2019