An Approach to Pruning the Structure of Convolutional Neural Networks without Loss of Generalization Ability

Cited by: 0
Authors
Chen, Chaoxiang [1 ,2 ,3 ]
Kroshchanka, Aliaksandr [4 ]
Golovko, Vladimir [4 ,5 ]
Golovko, Olha [4 ]
Affiliations
[1] Zhejiang Shuren Univ, Sch Informat Sci & Technol, Hangzhou 310015, Peoples R China
[2] Int Sci & Technol Cooperat Base Zhejiang Prov Remo, Hangzhou 310000, Peoples R China
[3] Zhejiang Shuren Univ, Inst Tradit Chinese Med Artificial Intelligence, Hangzhou 310015, Peoples R China
[4] Brest State Tech Univ, Brest 224017, Belarus
[5] John Paul II Univ Biala Podlaska, PL-21500 Biala Podlaska, Poland
Keywords
convolutional neural networks; convolutional restricted Boltzmann machine (CRBM); pruning of neural network parameters; pretraining of convolutional neural networks; computer vision;
DOI
10.1134/S1054661824700056
CLC Number
TP39 [Computer Applications];
Subject Classification
081203; 0835;
Abstract
This paper proposes an approach to pruning the parameters of convolutional neural networks using unsupervised pretraining. The authors demonstrate that the proposed approach makes it possible to reduce the number of trainable parameters of a convolutional neural network without loss of generalization ability. The proposed approach is compared with existing pruning techniques, and its capabilities are demonstrated on the classical CIFAR-10 and CIFAR-100 computer vision datasets.
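For context on what "pruning the parameters" means in practice, here is a minimal sketch of generic magnitude-based pruning, a common baseline that the paper compares against — not the authors' unsupervised-pretraining (CRBM) method. The function name, the flat weight list, and the threshold rule are illustrative assumptions:

```python
# Generic magnitude-based pruning sketch: zero out (approximately) the
# fraction `rate` of weights with the smallest absolute value. Ties at the
# threshold may prune slightly more than the requested fraction.
def prune_by_magnitude(weights, rate):
    """Return a copy of `weights` with the smallest-|w| fraction set to 0."""
    n_prune = int(len(weights) * rate)
    if n_prune == 0:
        return list(weights)
    # Threshold = magnitude of the n_prune-th smallest |w|.
    threshold = sorted(abs(w) for w in weights)[n_prune - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

# With rate=0.4 over five weights, the two smallest-magnitude entries
# (-0.01 and 0.02) are zeroed:
pruned = prune_by_magnitude([0.5, -0.01, 0.3, 0.02, -0.7], 0.4)
# → [0.5, 0.0, 0.3, 0.0, -0.7]
```

In real CNN pruning the zeroed weights are additionally masked during fine-tuning so they stay at zero; the paper's contribution is choosing which parameters to remove via unsupervised pretraining rather than raw magnitude.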
Pages: 258-265
Page count: 8