CAPTOR: A Class Adaptive Filter Pruning Framework for Convolutional Neural Networks in Mobile Applications

Cited by: 8
Authors
Qin, Zhuwei [1 ]
Yu, Fuxun [1 ]
Liu, Chenchen [2 ]
Chen, Xiang [1 ]
Affiliations
[1] George Mason Univ, Fairfax, VA 22030 USA
[2] Clarkson Univ, Potsdam, NY USA
Keywords
Mobile Application; Filter Pruning; Visualization
DOI
10.1145/3287624.3287643
Chinese Library Classification
TP301 [Theory and Methods]
Discipline Code
081202
Abstract
The evolution of deep learning and cloud services has significantly promoted neural-network-based mobile applications. Although intelligent and prolific, these applications still lack flexibility: for classification tasks, neural networks are generally trained with vast sets of classification targets to cover various utilization contexts. In practice, however, only a subset of those classes is ever exercised, owing to individual mobile user preference and application specificity. The unneeded classes thus incur considerable computation and communication cost. In this work, we propose CAPTOR, a class-level reconfiguration framework for Convolutional Neural Networks (CNNs). By identifying the class activation preference of convolutional filters through feature-interest visualization and gradient analysis, CAPTOR can effectively cluster, and adaptively prune, the filters associated with unneeded classes. CAPTOR thereby enables class-level CNN reconfiguration for model compression and local deployment on mobile devices. Experiments show that CAPTOR can reduce the computation load of VGG-16 by up to 40.5% and its energy consumption by 37.9% with negligible loss of accuracy. For AlexNet, CAPTOR reduces computation load by up to 42.8% and energy consumption by 37.6% with less than 3% loss in accuracy.
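The class-adaptive idea in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's method: the per-class filter score matrix, the `coverage` threshold, and the function name `class_adaptive_prune` are all illustrative assumptions standing in for CAPTOR's actual visualization- and gradient-based clustering.

```python
import numpy as np

def class_adaptive_prune(filter_class_scores, keep_classes, coverage=0.9):
    """Hypothetical sketch of class-adaptive filter selection.

    filter_class_scores: (num_filters, num_classes) matrix of per-class
        filter relevance scores (e.g., derived from gradient analysis).
    keep_classes: indices of the classes the mobile user actually needs.
    coverage: fraction of total relevance the kept filters must cover.

    Returns the sorted indices of filters to keep; all others would be
    pruned before deploying the compressed model on-device.
    """
    # Each filter's relevance to the retained classes only.
    relevance = filter_class_scores[:, keep_classes].max(axis=1)
    order = np.argsort(relevance)[::-1]  # most relevant filters first
    # Keep the smallest prefix of filters covering `coverage` of the
    # total relevance mass for the retained classes.
    cum = np.cumsum(relevance[order])
    n_keep = int(np.searchsorted(cum, coverage * relevance.sum())) + 1
    return np.sort(order[:n_keep])

# Toy example: 6 filters, 4 classes, user only needs classes 0 and 1.
rng = np.random.default_rng(0)
scores = rng.random((6, 4))
kept = class_adaptive_prune(scores, keep_classes=[0, 1])
print(kept)
```

In this toy setup, filters whose activation preference lies mostly with the dropped classes fall below the coverage cutoff and are pruned, mirroring the class-level reconfiguration the abstract describes.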
Pages: 444-449 (6 pages)
Related Papers (50 total)
  • [1] Fractional Step Discriminant Pruning: A Filter Pruning Framework for Deep Convolutional Neural Networks
    Gkalelis, Nikolaos
    Mezaris, Vasileios
    [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO WORKSHOPS (ICMEW), 2020,
  • [2] Acceleration of Deep Convolutional Neural Networks Using Adaptive Filter Pruning
    Singh, Pravendra
    Verma, Vinay Kumar
    Rai, Piyush
    Namboodiri, Vinay P.
    [J]. IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2020, 14 (04) : 838 - 847
  • [3] Entropy Induced Pruning Framework for Convolutional Neural Networks
    Lu, Yiheng
    Guan, Ziyu
    Yang, Yaming
    Zhao, Wei
    Gong, Maoguo
    Xu, Cai
    [J]. THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 4, 2024, : 3918 - 3926
  • [4] Automated Pruning of Neural Networks for Mobile Applications
    Glinserer, Andreas
    Lechner, Martin
    Wendt, Alexander
    [J]. 2021 IEEE 19TH INTERNATIONAL CONFERENCE ON INDUSTRIAL INFORMATICS (INDIN), 2021,
  • [5] Pruning convolutional neural networks via filter similarity analysis
    Geng, Lili
    Niu, Baoning
    [J]. MACHINE LEARNING, 2022, 111 (09) : 3161 - 3180
  • [6] Asymptotic Soft Filter Pruning for Deep Convolutional Neural Networks
    He, Yang
    Dong, Xuanyi
    Kang, Guoliang
    Fu, Yanwei
    Yan, Chenggang
    Yang, Yi
    [J]. IEEE TRANSACTIONS ON CYBERNETICS, 2020, 50 (08) : 3594 - 3604
  • [7] A Filter Rank Based Pruning Method for Convolutional Neural Networks
    Liu, Hao
    Guan, Zhenyu
    Lei, Peng
    [J]. 2021 IEEE 20TH INTERNATIONAL CONFERENCE ON TRUST, SECURITY AND PRIVACY IN COMPUTING AND COMMUNICATIONS (TRUSTCOM 2021), 2021, : 1318 - 1322
  • [8] Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks
    He, Yang
    Kang, Guoliang
    Dong, Xuanyi
    Fu, Yanwei
    Yang, Yi
    [J]. PROCEEDINGS OF THE TWENTY-SEVENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2018, : 2234 - 2240
  • [9] Filter pruning for convolutional neural networks in semantic image segmentation
    Lopez-Gonzalez, Clara I.
    Gasco, Esther
    Barrientos-Espillco, Fredy
    Besada-Portas, Eva
    Pajares, Gonzalo
    [J]. NEURAL NETWORKS, 2024, 169 : 713 - 732