Compression of deep neural networks: bridging the gap between conventional-based pruning and evolutionary approach

Cited: 0
Authors
Zhang, Yidan [1 ]
Wang, Guangjin [2 ]
Yang, Taibo [2 ]
Pang, Tianfeng [2 ]
He, Zhenan [1 ]
Lv, Jiancheng [1 ]
Affiliations
[1] Sichuan Univ, Coll Comp Sci, Chengdu 610065, Peoples R China
[2] Nucl Power Inst China, Reactor Engn Res Sub Inst, Chengdu, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2022, Vol. 34, Issue 19
Funding
National Natural Science Foundation of China; US National Science Foundation;
Keywords
Deep neural networks; Evolutionary algorithm; Filter pruning; Multiobjective optimization; ALGORITHM;
DOI
10.1007/s00521-022-07161-0
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recently, many studies have been carried out on model compression to handle the high computational cost and high memory footprint incurred by deploying deep neural networks. In this paper, model compression of convolutional neural networks is formulated as a multiobjective optimization problem with two conflicting objectives: reducing the model size and improving the performance. A novel structured pruning method called Conventional-based and Evolutionary Approaches Guided Multiobjective Pruning (CEA-MOP) is proposed to address this problem, where the power of conventional pruning methods is effectively exploited for the evolutionary process. A delicate balance between pruning rate and model accuracy is achieved automatically by a multiobjective evolutionary optimization model. First, an ensemble framework integrates pruning metrics to establish a codebook for further evolutionary operations. Then, an efficient encoding method is developed to shorten the chromosome length, ensuring superior scalability. Finally, sensitivity analysis is carried out automatically to determine the upper bound of the pruning rate for each layer. Notably, on CIFAR-10, CEA-MOP reduces FLOPs by more than 50% on ResNet-110 while improving accuracy, and on ImageNet it reduces FLOPs by more than 50% on ResNet-101 with a negligible drop in top-1 accuracy.
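To make the bi-objective formulation concrete, the sketch below illustrates, under simplifying assumptions, how a chromosome of per-layer pruning rates (each capped by a sensitivity-derived upper bound) could be scored against two conflicting objectives and filtered to a Pareto front. All names and values here (LAYER_CAPS, LAYER_FLOPS, the quadratic accuracy-loss surrogate) are hypothetical illustrations, not the paper's actual codebook, chromosome encoding, or evaluation procedure.

```python
import random

# Hypothetical per-layer sensitivity caps: the maximum pruning rate each layer
# tolerates (in CEA-MOP such bounds come from an automatic sensitivity analysis;
# these numbers are made up for illustration).
LAYER_CAPS = [0.3, 0.5, 0.7, 0.7, 0.5]          # one entry per prunable layer
LAYER_FLOPS = [120e6, 90e6, 60e6, 60e6, 30e6]   # rough per-layer FLOP counts (assumed)

def random_chromosome():
    """A chromosome is one pruning rate per layer, bounded by the sensitivity cap."""
    return [random.uniform(0.0, cap) for cap in LAYER_CAPS]

def objectives(chrom):
    """Two conflicting objectives to minimize: remaining FLOPs and an accuracy-loss
    proxy. A real pipeline would evaluate the pruned network on validation data;
    the quadratic surrogate only keeps this example self-contained."""
    remaining_flops = sum(f * (1.0 - r) for f, r in zip(LAYER_FLOPS, chrom))
    accuracy_loss = sum(r ** 2 for r in chrom)   # toy stand-in, not the paper's metric
    return remaining_flops, accuracy_loss

def dominates(a, b):
    """Pareto dominance for two minimization objectives."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    """Keep only the non-dominated chromosomes."""
    scored = [(c, objectives(c)) for c in population]
    return [c for c, fa in scored
            if not any(dominates(fb, fa) for _, fb in scored if fb != fa)]

if __name__ == "__main__":
    random.seed(0)
    pop = [random_chromosome() for _ in range(40)]
    for chrom in pareto_front(pop):
        flops, loss = objectives(chrom)
        print([round(r, 2) for r in chrom],
              f"FLOPs={flops / 1e6:.1f}M  acc-loss proxy={loss:.3f}")
```

In the method described by the abstract, the accuracy objective would be obtained by evaluating the pruned network itself, and candidate solutions would be seeded from a codebook built by an ensemble of conventional pruning metrics; the surrogate and random initialization above are placeholders for those steps.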
Pages: 16493-16514
Page count: 22
Related Papers
50 records in total
  • [1] Compression of deep neural networks: bridging the gap between conventional-based pruning and evolutionary approach
    Zhang, Yidan
    Wang, Guangjin
    Yang, Taibo
    Pang, Tianfeng
    He, Zhenan
    Lv, Jiancheng
    [J]. Neural Computing and Applications, 2022, 34 (19) : 16493 - 16514
  • [2] Compression of deep neural networks: bridging the gap between conventional-based pruning and evolutionary approach
    Yidan Zhang
    Guangjin Wang
    Taibo Yang
    Tianfeng Pang
    Zhenan He
    Jiancheng Lv
    [J]. Neural Computing and Applications, 2022, 34 : 16493 - 16514
  • [3] LightNN: Filling the Gap between Conventional Deep Neural Networks and Binarized Networks
    Ding, Ruizhou
    Liu, Zeye
    Shi, Rongye
    Marculescu, Diana
    Blanton, R. D.
    [J]. PROCEEDINGS OF THE GREAT LAKES SYMPOSIUM ON VLSI 2017 (GLSVLSI' 17), 2017, : 35 - 40
  • [4] EvoPruneDeepTL: An evolutionary pruning model for transfer learning based deep neural networks
    Poyatos, Javier
    Molina, Daniel
    Martinez, Aritz D.
    Del Ser, Javier
    Herrera, Francisco
    [J]. NEURAL NETWORKS, 2023, 158 : 59 - 82
  • [5] Deep neural networks compression learning based on multiobjective evolutionary algorithms
    Huang, Junhao
    Sun, Weize
    Huang, Lei
    [J]. NEUROCOMPUTING, 2020, 378 : 260 - 269
  • [6] Compression of Deep Neural Networks by combining pruning and low rank decomposition
    Goyal, Saurabh
    Choudhury, Anamitra Roy
    Sharma, Vivek
    [J]. 2019 IEEE INTERNATIONAL PARALLEL AND DISTRIBUTED PROCESSING SYMPOSIUM WORKSHOPS (IPDPSW), 2019, : 952 - 958
  • [7] Compression of Deep Convolutional Neural Networks Using Effective Channel Pruning
    Guo, Qingbei
    Wu, Xiao-Jun
    Zhao, Xiuyang
    [J]. IMAGE AND GRAPHICS, ICIG 2019, PT I, 2019, 11901 : 760 - 772
  • [8] DEEP LEARNING BASED METHOD FOR PRUNING DEEP NEURAL NETWORKS
    Li, Lianqiang
    Zhu, Jie
    Sun, Ming-Ting
    [J]. 2019 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA & EXPO WORKSHOPS (ICMEW), 2019, : 312 - 317
  • [9] The indirect evolutionary approach: Bridging the gap between rationality and adaptation
    Guth, W
    Kliemt, K
    [J]. RATIONALITY AND SOCIETY, 1998, 10 (03) : 377 - 399
  • [10] Evolutionary Compression of Deep Neural Networks for Biomedical Image Segmentation
    Zhou, Yao
    Yen, Gary G.
    Yi, Zhang
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (08) : 2916 - 2929