On the Information of Feature Maps and Pruning of Deep Neural Networks

Cited by: 3
Authors
Soltani, Mohammadreza [1 ]
Wu, Suya [1 ]
Ding, Jie [2 ]
Ravier, Robert [1 ]
Tarokh, Vahid [1 ]
Affiliations
[1] Duke Univ, Dept Elect & Comp Engn, Durham, NC 27706 USA
[2] Univ Minnesota, Sch Stat, Minneapolis, MN 55455 USA
Keywords
Deep neural compression; mutual information; feature maps
DOI
10.1109/ICPR48806.2021.9412579
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
A technique for compressing deep neural models that achieves performance competitive with state-of-the-art methods is proposed. The approach uses the mutual information between the feature maps and the output of the model to prune the redundant layers of the network. Extensive numerical experiments on the CIFAR-10, CIFAR-100, and Tiny ImageNet data sets demonstrate that the proposed method is effective in compressing deep models, both in terms of the number of parameters and the number of operations. For instance, applying the proposed approach to a DenseNet model with 0.77 million parameters and 293 million operations for classification of the CIFAR-10 data set yields reductions of 62.66% and 41.00% in the number of parameters and the number of operations, respectively, while increasing the test error by less than 1%.
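Since the abstract describes the method only at a high level, the following is a minimal sketch of mutual-information-based layer scoring, not the authors' exact estimator. The binned plug-in MI estimate, the per-sample mean-activation summary, and all function names (discretize, mutual_information, score_layers) are illustrative assumptions; the paper may use a different estimator and summary statistic.

```python
# Minimal sketch: score each layer by the mutual information between a summary
# of its feature maps and the class labels, then treat low-scoring layers as
# pruning candidates. The binning estimator and names are assumptions, not the
# paper's method.
import numpy as np


def discretize(x, n_bins=16):
    """Quantize a 1-D activation summary into equal-width bins."""
    edges = np.linspace(x.min(), x.max() + 1e-12, n_bins + 1)
    return np.clip(np.digitize(x, edges) - 1, 0, n_bins - 1)


def mutual_information(z, y):
    """Plug-in estimate of I(Z; Y) in nats from discrete codes z and labels y."""
    joint = np.zeros((z.max() + 1, y.max() + 1))
    for zi, yi in zip(z, y):
        joint[zi, yi] += 1
    joint /= joint.sum()
    pz = joint.sum(axis=1, keepdims=True)  # marginal P(Z)
    py = joint.sum(axis=0, keepdims=True)  # marginal P(Y)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / (pz @ py)[mask])))


def score_layers(feature_maps, labels, n_bins=16):
    """Score each layer by MI between a per-sample feature-map summary
    (here: mean activation) and the labels; low scores suggest redundancy."""
    scores = {}
    for name, fmap in feature_maps.items():
        summary = fmap.reshape(fmap.shape[0], -1).mean(axis=1)
        scores[name] = mutual_information(discretize(summary, n_bins), labels)
    return scores


# Usage with synthetic data: feature_maps maps layer names to activation
# arrays of shape (num_samples, ...), as would be collected via forward hooks.
rng = np.random.default_rng(0)
labels = rng.integers(0, 10, size=512)
feature_maps = {
    "block1": rng.normal(size=(512, 64, 8, 8)),
    "block2": rng.normal(size=(512, 128, 4, 4)),
}
print(score_layers(feature_maps, labels))
```

In such a scheme, layers whose score falls below a chosen fraction of the maximum would be removed and the network fine-tuned; the threshold and the fine-tuning schedule are design choices the record above does not specify.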
Pages: 6988-6995
Page count: 8