Magnitude and Uncertainty Pruning Criterion for Neural Networks

Cited: 0
Authors
Ko, Vinnie [1 ]
Oehmcke, Stefan [2 ]
Gieseke, Fabian [2 ]
Affiliations
[1] Univ Oslo, Oslo, Norway
[2] Univ Copenhagen, Copenhagen, Denmark
Keywords
Neural network compression; pruning; overparameterization; Wald test; maximum likelihood
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Neural networks have achieved dramatic improvements in recent years and represent the state-of-the-art methods for many real-world tasks nowadays. One drawback, however, is that many of these models are overparameterized, which makes them both computationally and memory intensive. Furthermore, overparameterization can also lead to undesired overfitting side-effects. Inspired by recently proposed magnitude-based pruning schemes and the Wald test from the field of statistics, we introduce a novel magnitude and uncertainty (M&U) pruning criterion that helps to lessen such shortcomings. One important advantage of our M&U pruning criterion is that it is scale-invariant, a property that purely magnitude-based pruning criteria lack. In addition, we present a "pseudo bootstrap" scheme, which can efficiently estimate the uncertainty of the weights by using their update information during training. Our experimental evaluation, based on various neural network architectures and datasets, shows that our new criterion yields more compressed models than magnitude-based pruning criteria alone, with, at the same time, less loss in predictive power.
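To make the idea concrete: a Wald statistic scores each parameter by its magnitude relative to its estimated standard error, so the M&U score can be read as |w_i| / se(w_i); dividing by the uncertainty is what makes the criterion scale-invariant. Below is a minimal Python sketch of this scoring rule. Estimating se(w_i) from the spread of weight snapshots recorded during training is only a simplified stand-in for the paper's pseudo-bootstrap, and all function names are illustrative.

import numpy as np

def mu_scores(weight_snapshots, eps=1e-12):
    # Wald-style magnitude-and-uncertainty score per weight:
    # |final weight| / estimated standard error of that weight.
    w_final = weight_snapshots[-1]               # magnitude part
    se_hat = weight_snapshots.std(axis=0) + eps  # uncertainty part
    return np.abs(w_final) / se_hat

def prune_mask(weight_snapshots, sparsity=0.5):
    # Boolean mask: True = keep, False = prune the lowest-scoring weights.
    scores = mu_scores(weight_snapshots)
    k = int(sparsity * scores.size)              # number of weights to prune
    threshold = np.partition(scores, k)[k]
    return scores >= threshold

# Toy usage: 100 snapshots of 10 flattened weights recorded during training.
rng = np.random.default_rng(0)
snaps = rng.normal(loc=rng.normal(size=10), scale=0.1, size=(100, 10))
print(prune_mask(snaps, sparsity=0.3))

Note that rescaling all snapshots by a constant c changes both |w_i| and se(w_i) by |c| and leaves the scores unchanged, whereas a plain |w_i| criterion would systematically favor layers with larger weight scales.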
Pages: 2317 - 2326
Page count: 10
Related Papers
50 in total
  • [41] Pruning neural networks with distribution estimation algorithms
    Cantú-Paz, E
    GENETIC AND EVOLUTIONARY COMPUTATION - GECCO 2003, PT I, PROCEEDINGS, 2003, 2723 : 790 - 800
  • [42] Activation Pruning of Deep Convolutional Neural Networks
    Ardakani, Arash
    Condo, Carlo
    Gross, Warren J.
    2017 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP 2017), 2017, : 1325 - 1329
  • [43] Fast Convex Pruning of Deep Neural Networks
    Aghasi, Alireza
    Abdi, Afshin
    Romberg, Justin
SIAM JOURNAL ON MATHEMATICS OF DATA SCIENCE, 2020, 2 (01): 158 - 188
  • [44] PRUNING ARTIFICIAL NEURAL NETWORKS USING NEURAL COMPLEXITY MEASURES
    Jorgensen, Thomas D.
    Haynes, Barry P.
    Norlund, Charlotte C. F.
    INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2008, 18 (05) : 389 - 403
  • [45] HYDRA: Pruning Adversarially Robust Neural Networks
    Sehwag, Vikash
    Wang, Shiqi
    Mittal, Prateek
    Jana, Suman
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [46] Blending Pruning Criteria for Convolutional Neural Networks
    He, Wei
    Huang, Zhongzhan
    Liang, Mingfu
    Liang, Senwei
    Yang, Haizhao
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT IV, 2021, 12894 : 3 - 15
  • [47] Comparison Analysis for Pruning Algorithms of Neural Networks
    Chen, Xi
    Mao, Jincheng
    Xie, Jian
    2021 2ND INTERNATIONAL CONFERENCE ON COMPUTER ENGINEERING AND INTELLIGENT CONTROL (ICCEIC 2021), 2021, : 50 - 56
  • [48] Discriminative Layer Pruning for Convolutional Neural Networks
    Jordao, Artur
    Lie, Maiko
    Schwartz, William Robson
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2020, 14 (04) : 828 - 837
  • [49] Performance criterion of neural networks learning
    CJSC Infozentr, Pushkina Street 199b, Abakan 655017, Russia
    OPTICAL MEMORY AND NEURAL NETWORKS (INFORMATION OPTICS), 2008, (3): 208 - 219
  • [50] Weight Uncertainty in Neural Networks
    Blundell, Charles
    Cornebise, Julien
    Kavukcuoglu, Koray
    Wierstra, Daan
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 37, 2015, 37 : 1613 - 1622