Dimensionality reduced training by pruning and freezing parts of a deep neural network: a survey

Cited by: 4
Authors
Wimmer, Paul [1 ,2 ]
Mehnert, Jens [1 ]
Condurache, Alexandru Paul [1 ,2 ]
Affiliations
[1] Robert Bosch GmbH, Automated Driving Res, Burgenlandstr 44, D-70469 Stuttgart, Germany
[2] Univ Lubeck, Inst Signal Proc, Ratzeburger Allee 160, D-23562 Lubeck, Germany
Keywords
Pruning; Freezing; Lottery ticket hypothesis; Dynamic sparse training; Pruning at initialization; Extreme learning machine
DOI
10.1007/s10462-023-10489-1
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
State-of-the-art deep learning models have parameter counts that reach into the billions. Training, storing and transferring such models is energy- and time-consuming, and thus costly. A large part of these costs is caused by training the network. Model compression lowers storage and transfer costs, and can further make training more efficient by decreasing the number of computations in the forward and/or backward pass. Thus, compressing networks at training time while maintaining high performance is an important research topic. This work is a survey of methods that reduce the number of trained weights in deep learning models throughout training. Most of the introduced methods set network parameters to zero, which is called pruning. The presented pruning approaches are categorized into pruning at initialization, lottery tickets and dynamic sparse training. Moreover, we discuss methods that freeze parts of a network at its random initialization. By freezing weights, the number of trainable parameters shrinks, which reduces gradient computations and the dimensionality of the model's optimization space. In this survey we first propose dimensionality reduced training as an underlying mathematical model that covers both pruning and freezing during training. Afterwards, we present and discuss different dimensionality reduced training methods, with a strong focus on unstructured pruning and freezing methods.
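The abstract's mask-based view of pruning and freezing can be sketched in a few lines of NumPy. This is a minimal illustration, not the survey's specific method: the function name `prune_mask` and the magnitude-based criterion are assumptions chosen for the sketch, and the same binary mask pattern also models freezing (masked entries simply keep their initial values instead of being zeroed).

```python
import numpy as np

rng = np.random.default_rng(0)

def prune_mask(weights, sparsity):
    """Binary mask keeping the largest-magnitude weights (unstructured pruning)."""
    k = int(weights.size * sparsity)  # number of weights to remove
    if k == 0:
        return np.ones_like(weights, dtype=bool)
    # k-th smallest absolute value serves as the pruning threshold
    threshold = np.sort(np.abs(weights), axis=None)[k - 1]
    return np.abs(weights) > threshold

# Toy layer: a 4x4 weight matrix pruned to 50% sparsity at initialization.
w = rng.standard_normal((4, 4))
mask = prune_mask(w, sparsity=0.5)
w_sparse = w * mask

# Dimensionality reduced update: masked-out entries receive no gradient,
# so only the surviving weights are trained (fewer gradient computations,
# lower-dimensional optimization space).
grad = rng.standard_normal((4, 4))  # stand-in for a backpropagated gradient
lr = 0.1
w_updated = w_sparse - lr * (grad * mask)
```

Freezing differs only in what happens to the masked entries: instead of `w * mask`, the frozen weights stay at their random initialization while the same masked update rule trains the rest.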
Pages: 14257-14295
Page count: 39
Related papers (50 in total)
  • [2] Pruning and quantization for deep neural network acceleration: A survey
    Liang, Tailin
    Glossner, John
    Wang, Lei
    Shi, Shaobo
    Zhang, Xiaotong
    Neurocomputing, 2021, 461: 370-403
  • [3] A Survey on Deep Neural Network Pruning: Taxonomy, Comparison, Analysis, and Recommendations
    Cheng, Hongrong
    Zhang, Miao
    Shi, Javen Qinfeng
    IEEE Transactions on Pattern Analysis and Machine Intelligence, 2024, 46(12): 10558-10578
  • [4] Convolutional Neural Network Pruning: A Survey
    Xu, Sheng
    Huang, Anran
    Chen, Lei
    Zhang, Baochang
    Proceedings of the 39th Chinese Control Conference, 2020: 7458-7463
  • [5] Pruning by Training: A Novel Deep Neural Network Compression Framework for Image Processing
    Tian, Guanzhong
    Chen, Jun
    Zeng, Xianfang
    Liu, Yong
    IEEE Signal Processing Letters, 2021, 28: 344-348
  • [6] Pruning by explaining: A novel criterion for deep neural network pruning
    Yeom, Seul-Ki
    Seegerer, Philipp
    Lapuschkin, Sebastian
    Binder, Alexander
    Wiedemann, Simon
    Mueller, Klaus-Robert
    Samek, Wojciech
    Pattern Recognition, 2021, 115
  • [7] A pruning algorithm for training neural network ensembles
    Shahjahan, M
    Akhand, MAH
    Murase, K
    SICE 2003 Annual Conference, Vols 1-3, 2003: 628-633
  • [8] Pruning the deep neural network by similar function
    Liu, Hanqing
    Xin, Bo
    Mu, Senlin
    Zhu, Zhangqing
    2018 International Symposium on Power Electronics and Control Engineering (ISPECE 2018), 2019, 1187
  • [9] Automated Pruning for Deep Neural Network Compression
    Manessi, Franco
    Rozza, Alessandro
    Bianco, Simone
    Napoletano, Paolo
    Schettini, Raimondo
    2018 24th International Conference on Pattern Recognition (ICPR), 2018: 657-664
  • [10] Overview of Deep Convolutional Neural Network Pruning
    Li, Guang
    Liu, Fang
    Xia, Yuping
    2020 International Conference on Image, Video Processing and Artificial Intelligence, 2020, 11584