Rewarded Meta-Pruning: Meta Learning with Rewards for Channel Pruning

Cited by: 0
Authors
Shibu, Athul [1 ]
Kumar, Abhishek [1 ]
Jung, Heechul [1 ]
Lee, Dong-Gyu [1 ]
Yang, Xinsong
Affiliations
[1] Kyungpook Natl Univ, Dept Artificial Intelligence, Daegu 41566, South Korea
Keywords
convolutional neural networks; meta-pruning; ResNet-50; reward function; channel pruning
DOI
10.3390/math11234849
Chinese Library Classification (CLC)
O1 [Mathematics]
Subject Classification Codes
0701; 070101
Abstract
Convolutional neural networks (CNNs) have gained recognition for their remarkable performance across various tasks. However, their sheer number of parameters and computational demands pose challenges, particularly on edge devices with limited processing power. In response to these challenges, this paper presents a novel approach aimed at enhancing the efficiency of deep learning models. Our method introduces accuracy and efficiency coefficients, offering a fine-grained control mechanism to balance the trade-off between network accuracy and computational efficiency. At the core of our approach is the Rewarded Meta-Pruning algorithm, which guides neural network training to generate pruned model weight configurations. The pruned model is selected based on approximations of the final model's parameters, and the selection is precisely controlled through a reward function. This reward function allows us to tailor the optimization process, leading to more effective fine-tuning and improved model performance. Extensive experiments and evaluations underscore the superiority of the proposed method over state-of-the-art techniques. We conducted rigorous pruning experiments on well-established architectures such as ResNet-50, MobileNetV1, and MobileNetV2. The results not only validate the efficacy of our approach but also highlight its potential to significantly advance model compression and deployment on resource-constrained edge devices.
Pages: 19
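The abstract describes a reward function that balances accuracy against computational efficiency through dedicated coefficients, but the exact formulation is not given in this record. The following is a minimal illustrative sketch, assuming a simple weighted combination of validation accuracy and relative FLOPs savings; the function name, coefficient values, and candidate numbers are hypothetical and are not taken from the paper.

# Illustrative sketch only: assumes the reward is a weighted sum of accuracy
# and compute savings. The paper's actual formulation may differ.
def reward(accuracy, flops, baseline_flops, acc_coeff=1.0, eff_coeff=0.5):
    # accuracy: top-1 validation accuracy of a candidate pruned model (0..1)
    # flops / baseline_flops: compute cost relative to the unpruned network
    efficiency = 1.0 - flops / baseline_flops  # fraction of FLOPs removed
    return acc_coeff * accuracy + eff_coeff * efficiency

# Hypothetical (accuracy, FLOPs) pairs for pruned ResNet-50 candidates.
candidates = [(0.762, 4.1e9), (0.751, 2.9e9), (0.738, 1.9e9)]
baseline_flops = 4.1e9  # approximate FLOPs of the unpruned ResNet-50
best = max(candidates, key=lambda c: reward(c[0], c[1], baseline_flops))
print("selected candidate:", best)

Under this assumed form, raising eff_coeff favors more aggressive pruning, while raising acc_coeff favors configurations that preserve accuracy, mirroring the trade-off the abstract attributes to the accuracy and efficiency coefficients.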