Adaptive Channel Pruning for Trainability Protection

Cited: 0
Authors
Liu, Jiaxin [1,2]
Zhang, Dazong [4]
Liu, Wei [1,2]
Li, Yongming [3]
Hu, Jun [2]
Cheng, Shuai [2]
Yang, Wenxing [1,2]
Affiliations
[1] Northeastern Univ, Sch Comp Sci & Engn, Shenyang 110167, Liaoning, Peoples R China
[2] Neusoft Reach Automot Technol Co, Shenyang 110179, Liaoning, Peoples R China
[3] Liaoning Univ Technol, Coll Sci, Liaoning 121001, Peoples R China
[4] BYD Auto Ind Co Ltd, Shenzhen 518118, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Convolutional neural networks; Trainability preservation; Model compression; Pruning;
DOI
10.1007/978-981-99-8549-4_12
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Pruning is a widely used method for compressing neural networks, reducing their computational cost by removing unimportant connections. However, many existing pruning methods prune pre-trained models with the same pruning rate for every layer, which neglects the protection of model trainability and damages accuracy. Moreover, the number of redundant parameters per layer varies in complex models, so the pruning rate must be adjusted according to the model structure and the training data. To overcome these issues, we propose a trainability-preserving adaptive channel pruning method that prunes during training. Our approach uses a similarity calculation module based on model weights to eliminate unnecessary channels while protecting model trainability and correcting the output feature maps. An adaptive sparsity control module assigns a pruning rate to each layer according to a preset target and aids network training. We conducted experiments on the CIFAR-10 and ImageNet classification datasets using networks of various structures, and our technique outperformed comparison methods at different pruning rates. We also confirmed its effectiveness on the VOC and COCO object detection datasets.
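As a rough illustration of the weight-similarity idea described in the abstract, the minimal PyTorch sketch below (an assumption for illustration only; the function and variable names are hypothetical and this is not the authors' implementation) ranks a convolution layer's output channels by the cosine similarity of their flattened weight vectors and marks the most redundant channels for removal. In the paper, the adaptive sparsity control module would supply each layer's rate so that a preset global target is met; the sketch replaces that module with a fixed prune_rate argument.

# Illustrative sketch only, not the code from the paper.
import torch
import torch.nn.functional as F

def channels_to_prune(conv_weight: torch.Tensor, prune_rate: float) -> torch.Tensor:
    # conv_weight: (out_channels, in_channels, kH, kW) weight tensor of a conv layer.
    # prune_rate: fraction of output channels to remove in this layer.
    rows = F.normalize(conv_weight.flatten(1), dim=1)  # one unit vector per output channel
    sim = rows @ rows.t()                              # pairwise cosine similarity
    sim.fill_diagonal_(0.0)                            # ignore self-similarity
    redundancy = sim.max(dim=1).values                 # best match among the other channels
    n_prune = int(prune_rate * conv_weight.shape[0])
    return torch.topk(redundancy, n_prune).indices     # most redundant channels first

# Example: mark 30% of a 64-channel layer's outputs for pruning.
w = torch.randn(64, 32, 3, 3)
print(channels_to_prune(w, prune_rate=0.3))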
Pages: 137-148
Number of Pages: 12
Related Papers
50 records in total
  • [1] Domain Adaptive Channel Pruning
    Yang, Ge
    Zhang, Chao
    Gao, Ling
    Guo, Yufei
    Guo, Jinyang
    ELECTRONICS, 2024, 13 (05)
  • [2] ACP: ADAPTIVE CHANNEL PRUNING FOR EFFICIENT NEURAL NETWORKS
    Zhang, Yuan
    Yuan, Yuan
    Wang, Qi
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 4488 - 4492
  • [3] AdaPruner: Adaptive Channel Pruning and Effective Weights Inheritance
    Liu X.
    Cao J.
    Yao H.
    Xu P.
    Zhang Y.
    Wang Y.
Beijing Daxue Xuebao (Ziran Kexue Ban)/Acta Scientiarum Naturalium Universitatis Pekinensis, 2023, 59 (05): 764 - 772
  • [4] A Lightweight Network With Adaptive Input and Adaptive Channel Pruning Strategy for Bearing Fault Diagnosis
    Liu, Lei
    Cheng, Yao
    Song, Dongli
    Zhang, Weihua
    Tang, Guiting
    Luo, Yaping
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73 : 1 - 11
  • [5] ADAPTIVE CONTROL - TRAINABILITY ADDS A NEW DIMENSION
    FEINBERG, B
MANUFACTURING ENGINEERING & MANAGEMENT, 1971, 67 (06): 18 - &
  • [6] A generality hard channel pruning with adaptive compression rate selection for HRNet
    Liu, Dongjingdian
    Gao, Shouwan
    Chen, Pengpeng
    Cheng, Lei
    PATTERN RECOGNITION LETTERS, 2023, 168 : 107 - 114
  • [7] Connection pruning with static and adaptive pruning schedules
    Prechelt, L
    NEUROCOMPUTING, 1997, 16 (01) : 49 - 61
  • [8] ADAPTIVE CHANNEL ERROR PROTECTION OF SUBBAND ENCODED IMAGES
    WESTERINK, PH
    WEBER, JH
    BOEKEE, DE
    LIMPERS, JW
    IEEE TRANSACTIONS ON COMMUNICATIONS, 1993, 41 (03) : 454 - 459
  • [9] Tailored Channel Pruning: Achieve Targeted Model Complexity Through Adaptive Sparsity Regularization
    Lee, Suwoong
    Jeon, Yunho
    Lee, Seungjae
    Kim, Junmo
    IEEE ACCESS, 2025, 13 : 12113 - 12126
  • [10] AKECP: Adaptive Knowledge Extraction from Feature Maps for Fast and Efficient Channel Pruning
    Zhang, Haonan
    Liu, Longjun
    Zhou, Hengyi
    Hou, Wenxuan
    Sun, Hongbin
    Zheng, Nanning
    PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2021, 2021, : 648 - 657