Pruning filters with L1-norm and capped L1-norm for CNN compression

Cited: 0
|
Authors
Kumar, Aakash [1 ]
Shaikh, Ali Muhammad [1 ]
Li, Yun [1 ]
Bilal, Hazrat [1 ]
Yin, Baoqun [1 ]
Affiliations
[1] Univ Sci & Technol China, Dept Automat, Hefei 230026, Peoples R China
Keywords
Filter pruning; Capped L1-norm; VGGnet; CIFAR; Convolutional neural network; FLOPs;
DOI
10.1007/s10489-020-01894-y
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
The rapid progress of convolutional neural networks (CNNs) in numerous real-world applications is usually hindered by a surge in network size and computational cost. Recently, researchers have focused on alleviating these issues by compressing CNN models, for example by pruning filters and weights. In contrast to weight pruning, filter pruning does not result in sparse connectivity patterns. In this article, we propose a new technique to estimate the significance of filters. More precisely, we combine the L1-norm with the capped L1-norm to represent the amount of information extracted by each filter and to control regularization. During pruning, insignificant filters are removed directly without any loss in test accuracy, yielding much slimmer and more compact models with comparable accuracy; this process is iterated a few times. We experimentally validate the effectiveness of our approach with several advanced CNN models on standard data sets. In particular, on CIFAR-10 with VGG-16, our method prunes 92.7% of the parameters and reduces floating-point operations (FLOPs) by 75.8% without loss of accuracy, advancing the state of the art.
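The abstract's criterion, scoring each filter by a capped L1-norm of its weights and pruning the lowest-scoring ones, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the cap value `eps`, the `prune_ratio` parameter, and the exact way the L1 and capped-L1 terms are combined are assumptions here.

```python
import numpy as np

def filter_importance(weights, eps):
    """Score each filter by its capped L1-norm, min(||F||_1, eps).

    weights: conv layer weights of shape (out_channels, in_channels, k, k).
    eps: cap hyperparameter (assumed; the paper's exact combination of
    the L1 and capped-L1 terms may differ).
    """
    l1 = np.abs(weights).reshape(weights.shape[0], -1).sum(axis=1)
    return np.minimum(l1, eps)

def prune_filters(weights, prune_ratio, eps):
    """Return sorted indices of filters to keep, dropping the lowest-scoring ones."""
    scores = filter_importance(weights, eps)
    n_keep = int(round(weights.shape[0] * (1.0 - prune_ratio)))
    keep = np.argsort(scores)[::-1][:n_keep]  # highest-scoring filters survive
    return np.sort(keep)

# Toy example: a layer with 8 filters of shape 3x3x3, pruned by 50%.
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 3, 3, 3))
kept = prune_filters(w, prune_ratio=0.5, eps=10.0)
print(len(kept))  # 4 filters survive
```

In an iterative scheme like the one the abstract describes, this scoring-and-pruning step would alternate with fine-tuning until the target compression is reached.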
Pages: 1152-1160
Page count: 9
Related papers
50 records in total
  • [1] Pruning filters with L1-norm and capped L1-norm for CNN compression
    Aakash Kumar
    Ali Muhammad Shaikh
    Yun Li
    Hazrat Bilal
    Baoqun Yin
    [J]. Applied Intelligence, 2021, 51 : 1152 - 1160
  • [2] Pruning Filters With L1-norm And Standard Deviation for CNN Compression
    Sun, Xinlu
    Zhou, Dianle
    Pan, Xiaotian
    Zhong, Zhiwei
    Wang, Fei
    [J]. ELEVENTH INTERNATIONAL CONFERENCE ON MACHINE VISION (ICMV 2018), 2019, 11041
  • [3] Robust Dictionary Learning with Capped l1-Norm
    Jiang, Wenhao
    Nie, Feiping
    Huang, Heng
    [J]. PROCEEDINGS OF THE TWENTY-FOURTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE (IJCAI), 2015, : 3590 - 3596
  • [4] Notes on quantum coherence with l1-norm and convex-roof l1-norm
    Zhu, Jiayao
    Ma, Jian
    Zhang, Tinggui
    [J]. QUANTUM INFORMATION PROCESSING, 2021, 20 (12)
  • [5] Supporting vectors for the l1-norm and the l∞-norm and an application
    Sanchez-Alzola, Alberto
    Garcia-Pacheco, Francisco Javier
    Naranjo-Guerra, Enrique
    Moreno-Pulido, Soledad
    [J]. MATHEMATICAL SCIENCES, 2021, 15 (02) : 173 - 187
  • [6] Capped L1-Norm Proximal Support Vector Machine
    Ren, Pei-Wei
    Li, Chun-Na
    Shao, Yuan-Hai
    [J]. MATHEMATICAL PROBLEMS IN ENGINEERING, 2022, 2022
  • [7] Linearized alternating directions method for l1-norm inequality constrained l1-norm minimization
    Cao, Shuhan
    Xiao, Yunhai
    Zhu, Hong
    [J]. APPLIED NUMERICAL MATHEMATICS, 2014, 85 : 142 - 153
  • [8] L1-norm quantile regression
    Li, Youjuan
    Zhu, Ji
    [J]. JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2008, 17 (01) : 163 - 185
  • [9] MRPP TESTS IN L1-NORM
    TRACY, DS
    KHAN, KA
    [J]. COMPUTATIONAL STATISTICS & DATA ANALYSIS, 1987, 5 (04) : 373 - 380
  • [10] MONOSPLINES OF MINIMAL L1-NORM
    ZHENSYKBAEV, AA
    [J]. MATHEMATICAL NOTES, 1983, 33 (5-6) : 443 - 452