A Computationally Efficient Weight Pruning Algorithm for Artificial Neural Network Classifiers

Citations: 0
Authors: Sakshi; Kumar, Ravi
Affiliation: [1] Electronics and Communication Engineering Department, Thapar University
Keywords: Weight pruning; Artificial neural network; Backpropagation; Complexity penalty; Fisher information; Pattern classification
Abstract
A novel technique is proposed for pruning the weights of artificial neural networks (ANNs) trained with the backpropagation algorithm. Iterative weight updates through gradient descent do not guarantee convergence within a specified number of epochs, and pruning non-relevant weights not only reduces computational complexity but also improves classification performance. The algorithm first defines the "relevance" of the initialized weights in a statistical sense by introducing a coefficient of dominance for each weight converging on a hidden node, and then applies the concept of a complexity penalty. Based on the complexity penalty of each weight, a decision is made to either prune or retain it. It is shown analytically that a weight with a higher complexity penalty carries a higher degree of Fisher information, which in turn implies a greater ability to capture variations in the input set for better classification. Simulation experiments on five benchmark data sets reveal that ANNs pruned with the proposed technique exhibit faster convergence, lower execution time, and a higher success rate in the test phase, yielding a substantial reduction in computational resources. For complex architectures, early convergence was found to be directly correlated with the percentage of weights pruned. The efficacy of the technique has been validated on benchmark datasets with a large diversity of attributes.
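The paper itself publishes no source code, so the NumPy sketch below illustrates only one plausible reading of the pruning step described above. The specific forms chosen for the coefficient of dominance (a weight's squared share of all weights converging on its hidden node), the complexity penalty (the classical weight-elimination term), the combined score, and the threshold value are assumptions made for illustration, not the authors' published formulas.

```python
import numpy as np

def prune_hidden_weights(W, threshold=0.05, w0=1.0):
    """Sketch of the prune-or-retain decision for input-to-hidden weights.

    W : ndarray of shape (n_inputs, n_hidden).
    Returns the pruned weight matrix and a boolean retain-mask.
    """
    # Coefficient of dominance: each weight's squared share of the total
    # squared weight converging on its hidden node (assumed form).
    node_energy = np.sum(W ** 2, axis=0, keepdims=True)
    dominance = W ** 2 / np.maximum(node_energy, 1e-12)

    # Complexity penalty per weight: the weight-elimination term
    # (w / w0)^2 / (1 + (w / w0)^2), used here as a stand-in for the
    # paper's penalty; a high penalty is linked to high Fisher information.
    scaled = (W / w0) ** 2
    penalty = scaled / (1.0 + scaled)

    # Retain a weight only when its dominance-weighted penalty is large
    # enough; everything below the (assumed) threshold is pruned to zero.
    retain = dominance * penalty >= threshold
    return W * retain, retain

# Usage: prune once after initialization, then train with backpropagation
# while masking gradient updates so pruned weights stay at zero.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(8, 4))          # 8 inputs, 4 hidden nodes
W_pruned, retain = prune_hidden_weights(W)
print(f"pruned {retain.size - np.count_nonzero(retain)} of {retain.size} weights")
```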
Source: Arabian Journal for Science and Engineering, 2018, 43 (12)
Pages: 6787–6799 (12 pages)