A Neural Network Pruning Approach based on Compressive Sampling

Cited by: 0
Authors
Yang, Jie [1 ]
Bouzerdoum, Abdesselam [1 ]
Phung, Son Lam [1 ]
Affiliations
[1] Univ Wollongong, Sch Elect Comp & Telecommun Engn, Wollongong, NSW, Australia
Keywords
MULTIPLE-MEASUREMENT VECTORS; SIGNAL RECONSTRUCTION; MATCHING PURSUITS; DICTIONARIES; INFORMATION; ALGORITHM
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The trade-off between computational complexity and architecture is a bottleneck in the development of Neural Networks (NNs). An architecture that is too large or too small strongly affects performance in terms of generalization and computational cost. In the past, saliency analysis has been employed to determine the most suitable structure; however, it is time-consuming and its performance is not robust. In this paper, a family of new algorithms for pruning elements (weights and hidden neurons) in Neural Networks is presented, based on Compressive Sampling (CS) theory. The proposed framework makes it possible to locate the significant elements, and hence find a sparse structure, without computing their saliency. Experimental results are presented which demonstrate the effectiveness of the proposed approach.
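The abstract does not spell out the algorithm itself. As a loose, illustrative sketch of the CS idea (selecting a small set of significant elements by greedy sparse recovery, matching-pursuit style, rather than by per-element saliency), the Python snippet below keeps only the hidden neurons whose activations best reconstruct a target output and refits the output weights on the retained neurons. All names here (omp_select, H, y, k) are hypothetical; this is not the authors' exact method, only the general idea under stated assumptions.

import numpy as np

def omp_select(H, y, k):
    """Greedily pick k columns of H (hidden-neuron activations) that best reconstruct y."""
    residual = y.copy()
    selected = []
    coef = np.zeros(0)
    for _ in range(k):
        corr = np.abs(H.T @ residual)          # correlate each neuron's activation with the residual
        corr[selected] = -np.inf               # never reselect an already-chosen neuron
        selected.append(int(np.argmax(corr)))
        # Refit the output weights on the selected neurons by least squares.
        coef, *_ = np.linalg.lstsq(H[:, selected], y, rcond=None)
        residual = y - H[:, selected] @ coef
    return selected, coef

# Hypothetical usage: 200 samples, 50 hidden neurons, keep the 10 most significant.
rng = np.random.default_rng(0)
H = rng.standard_normal((200, 50))             # hidden-layer activations (samples x neurons)
y = H[:, [3, 7, 21]] @ np.array([1.5, -2.0, 0.8]) + 0.01 * rng.standard_normal(200)
kept, out_weights = omp_select(H, y, k=10)
print("hidden neurons kept:", sorted(kept))    # the pruned network retains only these neurons

In this reading, the hidden-activation matrix plays the role of a CS dictionary and the greedy selection replaces saliency computation; the actual paper may formulate the recovery problem differently (e.g., over weights or with multiple-measurement vectors).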
Pages: 3213-3220
Number of pages: 8
Related Papers
(50 records)
  • [1] Neural Behavior-Based Approach for Neural Network Pruning
    Kamma, Koji
    Isoda, Yuki
    Inoue, Sarimu
    Wada, Toshikazu
    [J]. IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2020, E103D (05) : 1135 - 1143
  • [2] A Probabilistic Approach to Neural Network Pruning
    Qian, Xin
    Klabjan, Diego
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [3] Pruning Convolutional Neural Network with Distinctiveness Approach
    Li, Wenrui
    Plested, Jo
    [J]. NEURAL INFORMATION PROCESSING, ICONIP 2019, PT V, 2019, 1143 : 448 - 455
  • [4] ScoringNet: A Neural Network Based Pruning Criteria for Structured Pruning
    Wang S.
    Zhang Z.
    [J]. Scientific Programming, 2023, 2023
  • [5] A Discriminant Information Approach to Deep Neural Network Pruning
    Hou, Zejiang
    Kung, Sun-Yuan
    [J]. 2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 9553 - 9560
  • [6] Neural network pruning based on input importance
    Hewahi, Nabil M.
    [J]. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2019, 37 (02) : 2243 - 2252
  • [7] ARTIFICIAL NEURAL NETWORK DOES BETTER SPATIOTEMPORAL COMPRESSIVE SAMPLING
    Lee, Soo-Young
    Hsu, Charles
    Szu, Harold
    [J]. INDEPENDENT COMPONENT ANALYSES, COMPRESSIVE SAMPLING, WAVELETS, NEURAL NET, BIOSYSTEMS, AND NANOENGINEERING X, 2012, 8401
  • [8] A New Digital Predistortion Based On B-spline Function With Compressive Sampling Pruning
    Liu, Cen
    Luo, Laiwei
    Wang, Jun
    Zhang, Chao
    Pan, Changyong
    [J]. 2022 INTERNATIONAL WIRELESS COMMUNICATIONS AND MOBILE COMPUTING, IWCMC, 2022, : 1200 - 1205
  • [9] Neural network relief: a pruning algorithm based on neural activity
    Dekhovich, Aleksandr
    Tax, David M. J.
    Sluiter, Marcel H. F.
    Bessa, Miguel A.
    [J]. MACHINE LEARNING, 2024, 113 (05) : 2597 - 2618