Convolutional neural network pruning based on misclassification cost

Cited by: 1
Authors
Ahmadluei, Saeed [1 ]
Faez, Karim [2 ]
Masoumi, Behrooz [1 ]
Affiliations
[1] Islamic Azad Univ, Dept Comp & Informat Technol Engn, Qazvin Branch, Qazvin, Iran
[2] Amirkabir Univ Technol, Elect Engn Dept, Hafez Ave, Tehran 1591634311, Iran
Source
JOURNAL OF SUPERCOMPUTING | 2023, Vol. 79, Issue 18
Keywords
Convolutional neural networks; Pruning; Misclassification cost; DEEP; CLASSIFICATION;
DOI
10.1007/s11227-023-05487-7
CLC classification
TP3 [computing technology; computer technology];
Discipline code
0812 ;
Abstract
In a convolutional neural network (CNN), overparameterization increases the risk of overfitting, slows inference, and impedes edge computing. One way to resolve these challenges is to prune CNN parameters. The essence of pruning is to identify and eliminate unimportant filters so as to yield the highest speedup with the lowest accuracy loss. In contrast with other pruning methods, and in line with real-world requirements, this paper does not evaluate a CNN by its overall accuracy but instead analyzes the differing costs of misclassification. This modification accelerates the pruning process and improves the pruning ratio. The proposed algorithm determines the expected specificity/sensitivity for each class and finds the smallest CNN consistent with them. Layer-wise relevance propagation is employed to measure the contribution of each filter to the discrimination of every class. The importance of each filter is determined by integrating its local usefulness (within its layer) and global usefulness (contribution to the network output). Since the proposed algorithm alternates between pruning and recovery, further fine-tuning is unnecessary. According to simulation results, the proposed algorithm was efficient both in pruning a CNN and in attaining the desired sensitivity/specificity for each class.
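The core ranking idea in the abstract (weight each filter's per-class relevance by the class misclassification costs, then combine a layer-local and a network-global view of its usefulness) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the relevance arrays stand in for LRP outputs, and the particular normalization and combination rule are assumptions for the sake of a runnable example.

```python
import numpy as np

def combined_importance(relevance, class_costs):
    """Rank filters for pruning by cost-weighted relevance.

    relevance: dict mapping layer index -> array of shape (n_filters, n_classes),
        a stand-in for per-filter, per-class relevance scores such as those
        produced by layer-wise relevance propagation (LRP).
    class_costs: array of shape (n_classes,) giving the misclassification cost
        of each class; costlier classes weight relevance more heavily.
    Returns a list of (layer, filter, score) tuples sorted ascending by score,
    so the least important filters (pruning candidates) come first.
    """
    scores = []
    for layer, rel in relevance.items():
        # Global view: each filter's relevance weighted by class costs.
        global_score = rel @ class_costs                      # shape (n_filters,)
        # Local view: normalize within the layer so layers of different
        # magnitudes remain comparable.
        local_score = global_score / (np.abs(global_score).sum() + 1e-12)
        combined = global_score * local_score
        for f, s in enumerate(combined):
            scores.append((layer, f, float(s)))
    return sorted(scores, key=lambda t: t[2])

# Toy example: two layers, three classes; class 2 is twice as costly to misclassify.
rng = np.random.default_rng(0)
relevance = {0: rng.random((4, 3)), 1: rng.random((2, 3))}
costs = np.array([1.0, 1.0, 2.0])
ranking = combined_importance(relevance, costs)
candidate_layer, candidate_filter, _ = ranking[0]  # first filter to consider pruning
```

In the paper's scheme, pruning would then proceed from the bottom of this ranking, recovering filters whenever a class's expected sensitivity or specificity is violated; that pruning/recovery loop is omitted here.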
Pages: 21185 - 21234
Page count: 50
Related papers
50 in total
  • [1] Convolutional neural network pruning based on misclassification cost
    Saeed Ahmadluei
    Karim Faez
    Behrooz Masoumi
    [J]. The Journal of Supercomputing, 2023, 79 : 21185 - 21234
  • [2] Channel pruning based on convolutional neural network sensitivity
    Yang, Chenbin
    Liu, Huiyi
    [J]. NEUROCOMPUTING, 2022, 507 : 97 - 106
  • [3] Variational Convolutional Neural Network Pruning
    Zhao, Chenglong
    Ni, Bingbing
    Zhang, Jian
    Zhao, Qiwei
    Zhang, Wenjun
    Tian, Qi
    [J]. 2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 2775 - 2784
  • [4] Convolutional Neural Network Pruning: A Survey
    Xu, Sheng
    Huang, Anran
    Chen, Lei
    Zhang, Baochang
    [J]. PROCEEDINGS OF THE 39TH CHINESE CONTROL CONFERENCE, 2020, : 7458 - 7463
  • [5] Pruning algorithm of convolutional neural network based on optimal threshold
    Wang, Jianjun
    Liu, Leshan
    Pan, Ximeng
    [J]. 2020 5TH INTERNATIONAL CONFERENCE ON MATHEMATICS AND ARTIFICIAL INTELLIGENCE (ICMAI 2020), 2020, : 50 - 54
  • [6] Convolutional neural network acceleration algorithm based on filters pruning
    Li, Hao
    Zhao, Wen-Jie
    Han, Bo
    [J]. Zhejiang Daxue Xuebao (Gongxue Ban)/Journal of Zhejiang University (Engineering Science), 2019, 53 (10): : 1994 - 2002
  • [7] Convolutional Neural Network Channel Pruning Based on Regularized Sparse
    Bao, Chun
    Yu, Chongchong
    Xie, Tao
    Hu, Xinyu
    [J]. 2019 IEEE 4TH INTERNATIONAL CONFERENCE ON SIGNAL AND IMAGE PROCESSING (ICSIP 2019), 2019, : 679 - 684
  • [8] Lossless Reconstruction of Convolutional Neural Network for Channel-Based Network Pruning
    Lee, Donghyeon
    Lee, Eunho
    Hwang, Youngbae
    [J]. SENSORS, 2023, 23 (04)
  • [9] Pruning Convolutional Neural Network with Distinctiveness Approach
    Li, Wenrui
    Plested, Jo
    [J]. NEURAL INFORMATION PROCESSING, ICONIP 2019, PT V, 2019, 1143 : 448 - 455
  • [10] Overview of Deep Convolutional Neural Network Pruning
    Li, Guang
    Liu, Fang
    Xia, Yuping
    [J]. 2020 INTERNATIONAL CONFERENCE ON IMAGE, VIDEO PROCESSING AND ARTIFICIAL INTELLIGENCE, 2020, 11584