Magnitude and Uncertainty Pruning Criterion for Neural Networks

Cited by: 0
Authors
Ko, Vinnie [1 ]
Oehmcke, Stefan [2 ]
Gieseke, Fabian [2 ]
Affiliations
[1] Univ Oslo, Oslo, Norway
[2] Univ Copenhagen, Copenhagen, Denmark
Keywords
Neural network compression; pruning; overparameterization; Wald test; maximum likelihood
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Neural networks have achieved dramatic improvements in recent years and represent the state of the art for many real-world tasks. One drawback, however, is that many of these models are overparameterized, which makes them both computationally and memory intensive. Overparameterization can also lead to undesired overfitting side-effects. Inspired by recently proposed magnitude-based pruning schemes and the Wald test from statistics, we introduce a novel magnitude and uncertainty (M&U) pruning criterion that helps to lessen these shortcomings. One important advantage of our M&U criterion is that it is scale-invariant, a property that purely magnitude-based pruning criteria lack. In addition, we present a "pseudo bootstrap" scheme that efficiently estimates the uncertainty of the weights from their update information during training. Our experimental evaluation, based on various neural network architectures and datasets, shows that the new criterion leads to more compressed models than magnitude-based pruning criteria alone, with, at the same time, less loss in predictive power.
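The abstract does not spell out the exact formulas, but a Wald-test-inspired criterion naturally scores each weight by its magnitude divided by an estimate of its standard error. The NumPy sketch below illustrates that idea under stated assumptions: the functions pseudo_bootstrap_std, mu_scores, and prune_mask are hypothetical names, and using the sample standard deviation of late-training weight snapshots is only a stand-in for the paper's pseudo-bootstrap estimator, not the authors' implementation.

```python
import numpy as np

def pseudo_bootstrap_std(weight_snapshots):
    # weight_snapshots: (n_snapshots, n_weights) array of weight values
    # recorded during late training; its sample standard deviation serves
    # here as a stand-in for the paper's pseudo-bootstrap uncertainty.
    return weight_snapshots.std(axis=0, ddof=1)

def mu_scores(weights, weight_std, eps=1e-12):
    # Wald-style magnitude-and-uncertainty score: |w| / se(w).
    # eps guards against division by zero for perfectly stable weights.
    return np.abs(weights) / (weight_std + eps)

def prune_mask(scores, sparsity):
    # Keep the (1 - sparsity) fraction of weights with the highest scores.
    k = int(sparsity * scores.size)
    if k == 0:
        return np.ones(scores.shape, dtype=bool)
    threshold = np.partition(scores, k - 1)[k - 1]
    return scores > threshold  # weights tied at the threshold are pruned too

# Toy usage: four weights, 20 snapshots from late training.
rng = np.random.default_rng(0)
snapshots = rng.normal(loc=[0.5, -0.01, 2.0, 0.02],
                       scale=[0.05, 0.05, 0.5, 0.001],
                       size=(20, 4))
w = snapshots[-1]
scores = mu_scores(w, pseudo_bootstrap_std(snapshots))
print(prune_mask(scores, sparsity=0.5))
# Here the large-but-noisy weight (index 2) is pruned, while the
# tiny-but-stable weight (index 3) survives; a pure magnitude
# criterion would make the opposite choice.
```

The sketch also shows the claimed scale invariance: multiplying a layer's weights by a constant c scales both |w| and its estimated standard error by c, leaving the M&U scores (and hence the pruned set) unchanged, whereas a purely magnitude-based threshold would change which weights are pruned.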
Pages: 2317-2326
Page count: 10