Filter Pruning with Convolutional Approximation Small Model Framework

Cited: 0
Authors
Intraraprasit, Monthon [1]
Chitsobhuk, Orachat [1]
Affiliations
[1] King Mongkut's Institute of Technology Ladkrabang, School of Engineering, Chalongkrung Rd, Bangkok 10520, Thailand
Keywords
filter pruning framework; convolutional neural networks; deep learning; model compression
DOI
10.3390/computation11090176
CLC Number
O1 [Mathematics]
Discipline Codes
0701; 070101
Abstract
Convolutional neural networks (CNNs) are extensively utilized in computer vision; however, they pose challenges in terms of computational time and storage requirements. One well-known approach to reducing these costs is filter pruning. However, fine-tuning a pruned model demands substantial computing power and a large retraining dataset. To restore model performance after pruning each layer, we propose the Convolutional Approximation Small Model (CASM) framework. CASM trains a compact model built from the remaining kernels, optimizing their weights so that the resulting feature maps approximate those of the original model. This method has lower computational complexity and requires fewer training samples than basic fine-tuning. We evaluate the performance of CASM on the CIFAR-10 and ImageNet datasets using the VGG-16 and ResNet-50 models. The experimental results demonstrate that CASM surpasses the basic fine-tuning framework in recovery time (3.3x faster), requires a smaller dataset for performance recovery after pruning, and achieves higher accuracy.
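Since the record does not include the paper's code, the following is a minimal PyTorch sketch of the layer-wise idea the abstract describes: prune a convolutional layer's filters, then optimize the remaining kernels so their feature maps approximate the corresponding maps of the original layer on a small calibration batch. The L1-norm ranking criterion, the MSE reconstruction loss, and all names and hyperparameters here (prune_conv_by_l1, reconstruct_layer, keep_ratio, steps, lr) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of per-layer pruning plus feature-map reconstruction
# in the spirit of CASM. Assumptions (not from the paper): L1-norm
# filter ranking, MSE reconstruction loss, Adam optimizer, groups=1.
import torch
import torch.nn as nn

def prune_conv_by_l1(conv: nn.Conv2d, keep_ratio: float):
    """Return a smaller Conv2d keeping the filters with the largest
    L1 norms, plus the indices of the kept filters."""
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    scores = conv.weight.detach().abs().sum(dim=(1, 2, 3))  # L1 per filter
    keep = torch.topk(scores, n_keep).indices
    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    with torch.no_grad():
        pruned.weight.copy_(conv.weight[keep])
        if conv.bias is not None:
            pruned.bias.copy_(conv.bias[keep])
    return pruned, keep

def reconstruct_layer(original, pruned, keep, x_orig, x_pruned,
                      steps=200, lr=1e-3):
    """Optimize the remaining kernels so the pruned layer's feature maps
    match the corresponding maps of the original layer. x_orig feeds the
    original layer; x_pruned is what the same images produce at this
    depth in the already-pruned network (the drift to be repaired)."""
    target = original(x_orig)[:, keep].detach()
    opt = torch.optim.Adam(pruned.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(pruned(x_pruned), target)
        loss.backward()
        opt.step()

# Usage with a toy layer and a small calibration batch.
layer = nn.Conv2d(64, 128, 3, padding=1)
pruned, keep = prune_conv_by_l1(layer, keep_ratio=0.5)
x = torch.randn(32, 64, 16, 16)
reconstruct_layer(layer, pruned, keep, x, x)  # in a real pipeline the
# second input comes from the pruned network's preceding layers
```

A full CASM pass would presumably repeat this layer by layer, which is where the abstract's claimed savings over whole-network fine-tuning would come from: each subproblem touches only one layer's kernels and a small sample batch rather than the entire model and a large retraining dataset.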
Pages: 17
Related Papers
50 records in total
  • [1] Gkalelis, Nikolaos; Mezaris, Vasileios. Fractional Step Discriminant Pruning: A Filter Pruning Framework for Deep Convolutional Neural Networks. 2020 IEEE International Conference on Multimedia and Expo Workshops (ICMEW), 2020.
  • [2] Qin, Zhuwei; Yu, Fuxun; Liu, Chenchen; Chen, Xiang. CAPTOR: A Class Adaptive Filter Pruning Framework for Convolutional Neural Networks in Mobile Applications. 24th Asia and South Pacific Design Automation Conference (ASP-DAC 2019), 2019: 444-449.
  • [3] Erick, F. X.; Sawant, Shrutika S.; Goeb, Stephan; Holzer, N.; Lang, E. W.; Goetz, Th. A Simple and Effective Convolutional Filter Pruning Based on Filter Dissimilarity Analysis. ICAART: Proceedings of the 14th International Conference on Agents and Artificial Intelligence, Vol. 3, 2022: 139-145.
  • [4] Mousa-Pasandi, Morteza; Hajabdollahi, Mohsen; Karimi, Nader; Samavi, Shadrokh; Shirani, Shahram. Convolutional Neural Network Pruning Using Filter Attenuation. 2020 IEEE International Conference on Image Processing (ICIP), 2020: 2905-2909.
  • [5] Chen, Zehong; Xie, Zhonghua; Wang, Zhen; Xu, Tao; Zhang, Zhengrui. Filter Contribution Recycle: Boosting Model Pruning with Small Norm Filters. KSII Transactions on Internet and Information Systems, 2022, 16(11): 3507-3522.
  • [6] Lu, Yiheng; Guan, Ziyu; Yang, Yaming; Zhao, Wei; Gong, Maoguo; Xu, Cai. Entropy Induced Pruning Framework for Convolutional Neural Networks. Thirty-Eighth AAAI Conference on Artificial Intelligence, Vol. 38, No. 4, 2024: 3918-3926.
  • [7] Hubens, Nathan; Mancas, Matei; Gosselin, Bernard; Preda, Marius; Zaharia, Titus. Improve Convolutional Neural Network Pruning by Maximizing Filter Variety. Image Analysis and Processing, ICIAP 2022, Part I, 2022, 13231: 379-390.
  • [8] Geng, Lili; Niu, Baoning. Pruning Convolutional Neural Networks via Filter Similarity Analysis. Machine Learning, 2022, 111: 3161-3180.
  • [9] Lin, Shaohui; Ji, Rongrong; Li, Yuchao; Wu, Yongjian; Huang, Feiyue; Zhang, Baochang. Accelerating Convolutional Networks via Global & Dynamic Filter Pruning. Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence, 2018: 2425-2432.
  • [10] He, Yang; Dong, Xuanyi; Kang, Guoliang; Fu, Yanwei; Yan, Chenggang; Yang, Yi. Asymptotic Soft Filter Pruning for Deep Convolutional Neural Networks. IEEE Transactions on Cybernetics, 2020, 50(8): 3594-3604.