Magnitude and Uncertainty Pruning Criterion for Neural Networks

Cited by: 0
Authors
Ko, Vinnie [1 ]
Oehmcke, Stefan [2 ]
Gieseke, Fabian [2 ]
Affiliations
[1] Univ Oslo, Oslo, Norway
[2] Univ Copenhagen, Copenhagen, Denmark
Keywords
Neural network compression; pruning; overparameterization; Wald test; MAXIMUM-LIKELIHOOD;
DOI
Not available
Chinese Library Classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Neural networks have achieved dramatic improvements in recent years and now represent the state of the art for many real-world tasks. One drawback, however, is that many of these models are overparameterized, which makes them both computationally and memory intensive. Furthermore, overparameterization can also lead to undesired overfitting side effects. Inspired by recently proposed magnitude-based pruning schemes and the Wald test from the field of statistics, we introduce a novel magnitude and uncertainty (M&U) pruning criterion that helps to lessen such shortcomings. One important advantage of our M&U pruning criterion is that it is scale-invariant, a property that purely magnitude-based pruning criteria lack. In addition, we present a "pseudo bootstrap" scheme, which can efficiently estimate the uncertainty of the weights from their update information during training. Our experimental evaluation, based on various neural network architectures and datasets, shows that the new criterion leads to more compressed models than solely magnitude-based pruning criteria, with, at the same time, less loss in predictive power.
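The Wald-style idea behind the M&U criterion — rank each weight by its magnitude divided by an uncertainty estimate, rather than by magnitude alone — can be sketched as follows. This is an illustrative reconstruction, not the paper's exact procedure: the function names are invented here, and the uncertainty estimate (standard deviation of each weight across training snapshots) merely stands in for the paper's "pseudo bootstrap" scheme.

```python
import numpy as np

def mu_pruning_scores(weights, weight_history, eps=1e-12):
    """Illustrative M&U-style score: |w| / uncertainty(w), in the spirit
    of a Wald statistic. Uncertainty is approximated by the standard
    deviation of each weight across snapshots taken during training
    (a stand-in for the paper's pseudo-bootstrap estimate)."""
    uncertainty = np.std(weight_history, axis=0)
    return np.abs(weights) / (uncertainty + eps)

def prune_mask(scores, sparsity):
    """Boolean mask keeping the (1 - sparsity) fraction of weights
    with the highest scores; pruned entries are False."""
    k = int(np.ceil(sparsity * scores.size))
    if k <= 0:
        return np.ones_like(scores, dtype=bool)
    threshold = np.partition(scores.ravel(), k - 1)[k - 1]
    return scores > threshold
```

Note the scale-invariance the abstract highlights: rescaling a layer multiplies both the magnitude and its uncertainty estimate by the same factor, so the scores (and hence the pruning decisions) are essentially unchanged, whereas a pure `|w|` ranking would shift.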
Pages: 2317-2326
Page count: 10