Redundancy-Aware Pruning of Convolutional Neural Networks

Cited by: 3
Authors
Xie, Guotian [1 ,2 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Data & Comp Sci, Guangzhou 510006, Guangdong, Peoples R China
[2] Sun Yat Sen Univ, Guangdong Key Lab Informat Secur Technol, Guangzhou 510006, Guangdong, Peoples R China
Keywords
Convolutional neural networks; Digital arithmetic; Convolution; Redundancy
DOI
10.1162/neco_a_01330
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Pruning is an effective way to slim and speed up convolutional neural networks. Previous work generally pruned neural networks directly in the original feature space without considering the correlation of neurons. We argue that pruning in this way leaves redundancy in the pruned networks. In this letter, we propose to prune in an intermediate space in which the correlation of neurons is eliminated. To achieve this, the input and output of a convolutional layer are first mapped to the intermediate space by an orthogonal transformation; neurons are then evaluated and pruned in that space. Extensive experiments show that our redundancy-aware pruning method surpasses state-of-the-art pruning methods in both efficiency and accuracy. Notably, with redundancy-aware pruning, ResNet models pruned for a threefold speed-up achieve competitive performance with fewer floating-point operations (FLOPs), even compared to DenseNet.
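To make the core idea concrete, the Python/NumPy sketch below illustrates pruning in a decorrelated intermediate space. It is only an illustration under simplifying assumptions (a flattened, 1x1-convolution view of the layer, PCA eigendecomposition as the orthogonal transform, and a simple energy-based importance score), not the authors' exact evaluation criterion; the function and variable names are hypothetical.

# Minimal sketch (assumption-laden, not the paper's exact algorithm):
# decorrelate a layer's input channels with an orthogonal (PCA) transform,
# score components in that intermediate space, and keep the most energetic ones.
import numpy as np

def redundancy_aware_prune(activations, weights, keep_ratio=0.5):
    """activations: (N, C_in) sampled inputs to a 1x1-conv-style layer.
    weights: (C_out, C_in) layer weights.  keep_ratio: fraction of components kept.
    Returns the pruned weights expressed in the intermediate space and the
    orthogonal basis needed to map inputs into that space."""
    # Covariance of the inputs captures the correlation between channels.
    cov = np.cov(activations, rowvar=False)          # (C_in, C_in)
    eigvals, Q = np.linalg.eigh(cov)                 # Q is orthogonal
    order = np.argsort(eigvals)[::-1]                # sort by explained energy
    Q, eigvals = Q[:, order], eigvals[order]

    # Map weights into the intermediate space: y = W x = (W Q)(Q^T x).
    W_tilde = weights @ Q                            # (C_out, C_in)

    # Score each decorrelated component by the response energy it contributes.
    scores = eigvals * np.sum(W_tilde ** 2, axis=0)
    k = max(1, int(keep_ratio * len(scores)))
    keep = np.sort(np.argsort(scores)[::-1][:k])

    # Prune: drop low-scoring components in the intermediate space.
    return W_tilde[:, keep], Q[:, keep]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    acts = rng.normal(size=(1000, 64)) @ rng.normal(size=(64, 64))  # correlated inputs
    W = rng.normal(size=(128, 64))
    W_p, Q_k = redundancy_aware_prune(acts, W, keep_ratio=0.25)
    x = rng.normal(size=(64,))
    approx = W_p @ (Q_k.T @ x)   # fewer multiply-accumulates than the full W @ x
    print(approx.shape)          # (128,)

Because correlated channels collapse onto a few high-energy components after the orthogonal transform, pruning in this space removes redundancy that channel-wise pruning in the original space would miss; the retained components can later be folded back into a smaller convolution.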
Pages: 2482-2506
Page count: 25
Related Papers
50 records in total
  • [21] Leveraging Structured Pruning of Convolutional Neural Networks
    Tessier, Hugo
    Gripon, Vincent
    Leonardon, Mathieu
    Arzel, Matthieu
    Bertrand, David
    Hannagan, Thomas
    2022 IEEE WORKSHOP ON SIGNAL PROCESSING SYSTEMS (SIPS), 2022, : 174 - 179
  • [22] Flattening Layer Pruning in Convolutional Neural Networks
    Jeczmionek, Ernest
    Kowalski, Piotr A.
    SYMMETRY-BASEL, 2021, 13 (07):
  • [23] Structured Pruning of Deep Convolutional Neural Networks
    Anwar, Sajid
    Hwang, Kyuyeon
    Sung, Wonyong
    ACM JOURNAL ON EMERGING TECHNOLOGIES IN COMPUTING SYSTEMS, 2017, 13 (03)
  • [24] Energy-aware redundancy-aware clustering in wireless sensor networks using Spined Loach Searching Optimization
    Sasikala, N.
    Sangaiah, Pavalarajan
    INTERNATIONAL JOURNAL OF COMMUNICATION SYSTEMS, 2023, 36 (03)
  • [25] Activation Pruning of Deep Convolutional Neural Networks
    Ardakani, Arash
    Condo, Carlo
    Gross, Warren J.
    2017 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP 2017), 2017, : 1325 - 1329
  • [26] Blending Pruning Criteria for Convolutional Neural Networks
    He, Wei
    Huang, Zhongzhan
    Liang, Mingfu
    Liang, Senwei
    Yang, Haizhao
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT IV, 2021, 12894 : 3 - 15
  • [27] Discriminative Layer Pruning for Convolutional Neural Networks
    Jordao, Artur
    Lie, Maiko
    Schwartz, William Robson
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2020, 14 (04) : 828 - 837
  • [28] Designing Energy-Efficient Convolutional Neural Networks using Energy-Aware Pruning
    Yang, Tien-Ju
    Chen, Yu-Hsin
    Sze, Vivienne
    30TH IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2017), 2017, : 6071 - 6079
  • [29] Redundancy-aware SOAP messages compression and aggregation for enhanced performance
    Al-Shammary, Dhiah
    Khalil, Ibrahim
    JOURNAL OF NETWORK AND COMPUTER APPLICATIONS, 2012, 35 (01) : 365 - 381
  • [30] Partition Pruning: Parallelization-Aware Pruning for Dense Neural Networks
    Shahhosseini, Sina
    Albaqsami, Ahmad
    Jasemi, Masoomeh
    Bagherzadeh, Nader
    2020 28TH EUROMICRO INTERNATIONAL CONFERENCE ON PARALLEL, DISTRIBUTED AND NETWORK-BASED PROCESSING (PDP 2020), 2020, : 307 - 311