Redundancy-Aware Pruning of Convolutional Neural Networks

Cited by: 3
Authors
Xie, Guotian [1 ,2 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Data & Comp Sci, Guangzhou 510006, Guangdong, Peoples R China
[2] Sun Yat Sen Univ, Guangdong Key Lab Informat Secur Technol, Guangzhou 510006, Guangdong, Peoples R China
Keywords
DOI
10.1162/neco_a_01330
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Pruning is an effective way to slim and speed up convolutional neural networks. Previous work has generally pruned networks directly in the original feature space, without considering the correlations among neurons; we argue that pruning this way leaves redundancy in the pruned networks. In this letter, we propose pruning in an intermediate space in which the correlations among neurons are eliminated. To achieve this, the input and output of a convolutional layer are first mapped to the intermediate space by an orthogonal transformation; neurons are then evaluated and pruned in that space. Extensive experiments show that our redundancy-aware pruning method surpasses state-of-the-art pruning methods in both efficiency and accuracy. Notably, with redundancy-aware pruning, ResNet models pruned for a threefold speed-up achieve competitive performance with fewer floating-point operations (FLOPs), even compared to DenseNet.
Pages: 2482-2506
Page count: 25
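
The abstract above describes mapping a layer's responses to a decorrelated intermediate space via an orthogonal transformation and pruning neurons there. As a minimal sketch of that idea, the NumPy code below diagonalizes the covariance of a layer's output channels, keeps the high-energy decorrelated components, and folds the truncated orthogonal transform back into the kernel. The function name redundancy_aware_prune, the variance-based scoring, and the covariance-of-activations construction are illustrative assumptions, not the letter's exact algorithm.

import numpy as np

def redundancy_aware_prune(weights, activations, keep_ratio=0.5):
    """Illustrative sketch only (assumptions, not the paper's method).

    weights     : (C_out, C_in, kH, kW) convolution kernel
    activations : (N, C_out) sampled per-location channel responses
    keep_ratio  : fraction of decorrelated components to retain
    """
    C_out = weights.shape[0]

    # Covariance of the output channels; its eigenvectors Q form an
    # orthogonal transform, so Q.T @ a has diagonal covariance (the
    # "intermediate space" in which neuron correlations are eliminated).
    centered = activations - activations.mean(axis=0, keepdims=True)
    cov = centered.T @ centered / len(centered)
    eigvals, Q = np.linalg.eigh(cov)  # eigenvalues in ascending order

    # Score each decorrelated component by its variance (eigenvalue)
    # and keep the top-k; low-variance directions are pruned.
    k = max(1, int(keep_ratio * C_out))
    keep = np.argsort(eigvals)[::-1][:k]
    Q_k = Q[:, keep]  # (C_out, k) truncated orthogonal basis

    # Fold the truncated transform into the kernel: new output channel m
    # is sum_o Q_k[o, m] * weights[o], so the pruned layer emits the k
    # retained decorrelated components directly.
    pruned_w = np.einsum('oikl,om->mikl', weights, Q_k)
    return pruned_w, Q_k

Applying Q_k as a 1x1 convolution to the pruned layer's output approximately restores the original C_out channels; in practice that 1x1 map would be absorbed into the next layer's kernel, so no extra operations remain at inference time.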