Pruning by leveraging training dynamics

Cited: 0
Authors
Apostol, Andrei C. [1,2]
Stol, Maarten C. [2 ]
Forre, Patrick [1 ]
Affiliations
[1] Univ Amsterdam, Informat Inst, Amsterdam, Netherlands
[2] BrainCreators BV, Amsterdam, Netherlands
Keywords
Deep learning; network pruning; quantization; computer vision
DOI
10.3233/AIC-210127
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
We propose a novel pruning method which uses the oscillations around 0, i.e. sign flips, that a weight has undergone during training in order to determine its saliency. Our method can perform pruning before the network has converged, requires little tuning effort due to having good default values for its hyperparameters, and can directly target the level of sparsity desired by the user. Our experiments, performed on a variety of object classification architectures, show that it is competitive with existing methods and achieves state-of-the-art performance for levels of sparsity of 99.6% and above for 2 out of 3 of the architectures tested. Moreover, we demonstrate that our method is compatible with quantization, another model compression technique.
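The abstract describes ranking weights by how often they flip sign around 0 during training, then pruning directly to a user-specified sparsity level. The sketch below is a minimal illustration of that idea, not the paper's exact algorithm: the saliency formula `|w_final| / (1 + flips)` and all function names are assumptions for exposition only (the paper defines its own saliency and schedule).

```python
import numpy as np

def count_sign_flips(weight_history):
    """Count, per weight, how often its sign changed across training snapshots.

    weight_history: array of shape (steps, n_weights) holding weight values
    recorded at successive training steps.
    """
    signs = np.sign(weight_history)
    # A flip occurs where consecutive snapshots carry opposite (nonzero) signs.
    return (signs[1:] * signs[:-1] < 0).sum(axis=0)

def flip_based_mask(weight_history, sparsity):
    """Build a keep-mask at the requested sparsity level.

    Hypothetical saliency: |w_final| / (1 + flips), so weights that oscillate
    around zero (many flips, small magnitude) are pruned first. This is an
    illustrative choice, not necessarily the formula used in the paper.
    Returns a boolean mask (True = keep).
    """
    flips = count_sign_flips(weight_history)
    saliency = np.abs(weight_history[-1]) / (1.0 + flips)
    n = saliency.size
    k = int(round(sparsity * n))      # number of weights to remove
    order = np.argsort(saliency)     # lowest-saliency weights first
    mask = np.ones(n, dtype=bool)
    mask[order[:k]] = False
    return mask
```

For example, with three snapshots of three weights, a weight bouncing between -0.01 and 0.02 accumulates two flips and a tiny magnitude, so it is the first to be removed when targeting 1/3 sparsity. Directly targeting the sparsity level via `k = round(sparsity * n)` mirrors the abstract's claim that the user-desired sparsity can be hit exactly, without threshold tuning.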
Pages: 65-85
Page count: 21
Related Papers
50 records
  • [1] Deep Active Learning by Leveraging Training Dynamics
    Wang, Haonan
    Huang, Wei
    Wu, Ziwei
    Margenot, Andrew
    Tong, Hanghang
    He, Jingrui
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [2] Leveraging the Graph Structure of Neural Network Training Dynamics
    Vahedian, Fatemeh
    Li, Ruiyu
    Trivedi, Puja
    Jin, Di
    Koutra, Danai
    [J]. PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 4545 - 4549
  • [3] PRUNING AND TRAINING
    Browning, Dominique
    [J]. NEW YORK TIMES BOOK REVIEW, 2013, : 18 - 19
  • [4] Leveraging Structured Pruning of Convolutional Neural Networks
    Tessier, Hugo
    Gripon, Vincent
    Leonardon, Mathieu
    Arzel, Matthieu
    Bertrand, David
    Hannagan, Thomas
    [J]. 2022 IEEE WORKSHOP ON SIGNAL PROCESSING SYSTEMS (SIPS), 2022, : 174 - 179
  • [5] Leveraging ensemble pruning for imbalanced data classification
    Krawczyk, Bartosz
    Wozniak, Michal
    [J]. 2018 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2018, : 439 - 444
  • [6] A Novel Thought of Pruning Algorithms: Pruning Based on Less Training
    Li, Yue
    Zhao, Weibin
    Shang, Lin
    [J]. PRICAI 2019: TRENDS IN ARTIFICIAL INTELLIGENCE, PT II, 2019, 11671 : 136 - 148
  • [7] ONE-CYCLE PRUNING: PRUNING CONVNETS WITH TIGHT TRAINING BUDGET
    Hubens, Nathan
    Mancas, Matei
    Gosselin, Bernard
    Preda, Marius
    Zaharia, Titus
    [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, ICIP, 2022, : 4128 - 4132
  • [8] Adaptive training and pruning in feedforward networks
    Chang, SJ
    Sum, J
    Wong, KW
    Leung, CS
    [J]. ELECTRONICS LETTERS, 2001, 37 (02) : 106 - 107
  • [9] PRUNING AND TRAINING OF RED DELICIOUS APPLES
    DOZIER, WA
    CARLTON, CC
    SHORT, KC
    GRIFFEY, WA
    BURGESS, HE
    POWELL, AA
    MCGUIRE, J
    [J]. ALABAMA AGRICULTURAL EXPERIMENT STATION BULLETIN, 1980, (519): : 3 - 23
  • [10] THE EFFECT OF LATTICE PRUNING ON MMIE TRAINING
    Qin, Long
    Rudnicky, Alexander
    [J]. 2010 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2010, : 4898 - 4901