Overview of Deep Convolutional Neural Network Pruning

Cited by: 1
Authors
Li, Guang [1 ]
Liu, Fang [1 ]
Xia, Yuping [1 ]
Affiliations
[1] Natl Univ Def Technol, Coll Elect Sci & Technol, Natl Key Lab Sci & Technol Automat Target Recogni, Changsha 410073, Peoples R China
Keywords
neural network; model compression; network pruning;
DOI
10.1117/12.2580086
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In recent years, the rapid development of deep convolutional neural networks has made deep learning model inference increasingly expensive in computing resources. Because of their limited resources, most current edge devices cannot support deep learning applications that demand low latency, low power consumption, and high accuracy. Model compression and acceleration of deep networks are therefore an effective solution, and network pruning, which simplifies a model by removing redundant parameters before the inference stage, has been a hot research topic in this field in recent years. This paper analyzes the existing work from six aspects, surveys the latest progress in deep neural network pruning from the perspectives of pruning granularity and weight importance criteria, and finally points out problems in current research and analyzes future research directions in the field of pruning.
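The core idea the survey covers, removing redundant parameters from a trained network, can be illustrated with a minimal unstructured magnitude-pruning sketch. This is not from the paper itself; the layer shape, sparsity target, and function name are illustrative assumptions:

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the given fraction of smallest-magnitude weights (unstructured pruning)."""
    k = int(sparsity * weights.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # Threshold = k-th smallest absolute value; weights at or below it are pruned.
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

# Toy example: prune 90% of a hypothetical 64x64 fully connected layer.
rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64))
pruned = magnitude_prune(w, sparsity=0.9)
print(1.0 - np.count_nonzero(pruned) / pruned.size)  # achieved sparsity, close to 0.9
```

Finer- or coarser-grained variants (pruning whole filters or channels instead of individual weights) and alternative importance criteria are exactly the axes along which the survey organizes the literature.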
Pages: 16
Related Papers
50 records in total
  • [1] Variational Convolutional Neural Network Pruning
    Zhao, Chenglong
    Ni, Bingbing
    Zhang, Jian
    Zhao, Qiwei
    Zhang, Wenjun
    Tian, Qi
    [J]. 2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 2775 - 2784
  • [2] Convolutional Neural Network Pruning: A Survey
    Xu, Sheng
    Huang, Anran
    Chen, Lei
    Zhang, Baochang
    [J]. PROCEEDINGS OF THE 39TH CHINESE CONTROL CONFERENCE, 2020, : 7458 - 7463
  • [3] An FPGA Realization of a Deep Convolutional Neural Network Using a Threshold Neuron Pruning
    Fujii, Tomoya
    Sato, Simpei
    Nakahara, Hiroki
    Motomura, Masato
    [J]. APPLIED RECONFIGURABLE COMPUTING, 2017, 10216 : 268 - 280
  • [4] Structured Pruning of Deep Convolutional Neural Networks
    Anwar, Sajid
    Hwang, Kyuyeon
    Sung, Wonyong
    [J]. ACM JOURNAL ON EMERGING TECHNOLOGIES IN COMPUTING SYSTEMS, 2017, 13 (03)
  • [5] Activation Pruning of Deep Convolutional Neural Networks
    Ardakani, Arash
    Condo, Carlo
    Gross, Warren J.
    [J]. 2017 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP 2017), 2017, : 1325 - 1329
  • [6] Pruning Convolutional Neural Network with Distinctiveness Approach
    Li, Wenrui
    Plested, Jo
    [J]. NEURAL INFORMATION PROCESSING, ICONIP 2019, PT V, 2019, 1143 : 448 - 455
  • [7] Rethinking the Pruning Criteria for Convolutional Neural Network
    Huang, Zhongzhan
    Shao, Wenqi
    Wang, Xinjiang
    Lin, Liang
    Luo, Ping
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [8] Thinning of convolutional neural network with mixed pruning
    Yang, Wenzhu
    Jin, Lilei
    Wang, Sile
    Cui, Zhenchao
    Chen, Xiangyang
    Chen, Liping
    [J]. IET IMAGE PROCESSING, 2019, 13 (05) : 779 - 784
  • [9] Structured Pruning for Deep Convolutional Neural Networks: A Survey
    He, Yang
    Xiao, Lingao
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (05) : 2900 - 2919
  • [10] Deep Convolutional Neural Network
    Zhou, Yu
    Fang, Rui
    Liu, Peng
    Liu, Kai
    [J]. 2019 PROCEEDINGS OF THE CONFERENCE ON CONTROL AND ITS APPLICATIONS, CT, 2019, : 46 - 51