Rewarded Meta-Pruning: Meta Learning with Rewards for Channel Pruning

Cited by: 0
Authors
Shibu, Athul [1 ]
Kumar, Abhishek [1 ]
Jung, Heechul [1 ]
Lee, Dong-Gyu [1 ]
Yang, Xinsong
Affiliations
[1] Kyungpook Natl Univ, Dept Artificial Intelligence, Daegu 41566, South Korea
Keywords
convolutional neural networks; meta-pruning; ResNet-50; reward function; channel pruning
DOI
10.3390/math11234849
Chinese Library Classification
O1 [Mathematics]
Discipline Code
0701; 070101
Abstract
Convolutional neural networks (CNNs) have gained recognition for their remarkable performance across various tasks. However, their sheer number of parameters and computational demands pose challenges, particularly on edge devices with limited processing power. In response, this paper presents a novel approach to enhancing the efficiency of deep learning models. Our method introduces accuracy and efficiency coefficients, offering a fine-grained mechanism to balance the trade-off between network accuracy and computational efficiency. At the core of our method is the Rewarded Meta-Pruning algorithm, which guides neural network training to generate pruned model weight configurations. The pruned model is selected based on approximations of the final model's parameters, and the selection is precisely controlled through a reward function. This reward function allows us to tailor the optimization process, leading to more effective fine-tuning and improved model performance. Extensive experiments and evaluations demonstrate the superiority of the proposed method over state-of-the-art techniques. We conducted rigorous pruning experiments on well-established architectures such as ResNet-50, MobileNetV1, and MobileNetV2. The results not only validate the efficacy of our approach but also highlight its potential to advance model compression and deployment on resource-constrained edge devices.
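The abstract describes a reward function built from accuracy and efficiency coefficients that scores candidate pruned models. The paper's exact formulation is not given here, so the sketch below is a hypothetical illustration of that idea: the function name, the linear form, and the coefficient values `alpha` and `beta` are all assumptions, not the authors' method.

```python
# Hypothetical reward for scoring a candidate pruned model.
# The linear combination and default coefficients are illustrative
# assumptions; the paper's actual reward function may differ.

def pruning_reward(accuracy: float, flops: float, base_flops: float,
                   alpha: float = 1.0, beta: float = 0.5) -> float:
    """Score a pruned model by balancing accuracy against compute savings.

    accuracy   -- validation accuracy of the pruned model, in [0, 1]
    flops      -- FLOPs of the pruned model
    base_flops -- FLOPs of the unpruned baseline
    alpha      -- accuracy coefficient (weights correctness)
    beta       -- efficiency coefficient (weights compute reduction)
    """
    efficiency = 1.0 - flops / base_flops  # fraction of compute removed
    return alpha * accuracy + beta * efficiency


# Example: between two candidates with equal accuracy, the one with
# fewer FLOPs receives the higher reward, steering the search toward
# leaner configurations.
lean = pruning_reward(accuracy=0.76, flops=2.0e9, base_flops=4.1e9)
heavy = pruning_reward(accuracy=0.76, flops=3.0e9, base_flops=4.1e9)
assert lean > heavy
```

Raising `beta` relative to `alpha` would push the search toward smaller models at some cost in accuracy, which is the fine-grained trade-off control the abstract refers to.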
Pages: 19
Related Papers
50 items in total
  • [21] Channel Pruning of Transfer Learning Models Using Novel Techniques
    Thaker, Pragnesh
    Mohan, Biju R.
    IEEE ACCESS, 2024, 12 : 94914 - 94925
  • [22] Channel pruning guided by global channel relation
    Cheng, Yingjie
    Wang, Xiaoqi
    Xie, Xiaolan
    Li, Wentao
    Peng, Shaoliang
    APPLIED INTELLIGENCE, 2022, 52 (14) : 16202 - 16213
  • [24] Channel Pruning Method for Signal Modulation Recognition Deep Learning Models
    Chen, Zhuangzhi
    Wang, Zhangwei
    Gao, Xuzhang
    Zhou, Jinchao
    Xu, Dongwei
    Zheng, Shilian
    Xuan, Qi
    Yang, Xiaoniu
    IEEE TRANSACTIONS ON COGNITIVE COMMUNICATIONS AND NETWORKING, 2024, 10 (02) : 442 - 453
  • [25] Pruning algorithms for rule learning
    Fürnkranz, J
    MACHINE LEARNING, 1997, 27 (02) : 139 - 171
  • [26] A tight integration of pruning and learning
    Fürnkranz, J
    MACHINE LEARNING: ECML-95, 1995, 912 : 291 - 294
  • [27] Learning Compact Networks via Similarity-aware Channel Pruning
    Zhang, Quan
    Shi, Yemin
    Zhang, Lechun
    Wang, Yaowei
    Tian, Yonghong
    THIRD INTERNATIONAL CONFERENCE ON MULTIMEDIA INFORMATION PROCESSING AND RETRIEVAL (MIPR 2020), 2020, : 149 - 152
  • [28] Identification of plant leaf diseases by deep learning based on channel attention and channel pruning
    Chen, Riyao
    Qi, Haixia
    Liang, Yu
    Yang, Mingchao
    FRONTIERS IN PLANT SCIENCE, 2022, 13
  • [30] Exploiting Channel Similarity for Network Pruning
    Zhao, Chenglong
    Zhang, Yunxiang
    Ni, Bingbing
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2023, 33 (09) : 5049 - 5061