Automatic Channel Pruning with Hyper-parameter Search and Dynamic Masking

Cited by: 3
Authors
Li, Baopu [1 ]
Fan, Yanwen [2 ]
Pan, Zhihong [1 ]
Bian, Yuchen [3 ]
Zhang, Gang [2 ]
Affiliations
[1] Baidu USA LLC, Sunnyvale, CA 94089 USA
[2] Baidu Inc, VIS Dept, Beijing, Peoples R China
[3] Baidu Res, Beijing, Peoples R China
Keywords
Model compression; Network pruning; AutoML
DOI
10.1145/3474085.3475370
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Modern deep neural network models tend to be large and computationally intensive. One typical solution to this issue is model pruning. However, most current pruning algorithms depend on hand-crafted rules or require the pruning ratio as an input beforehand. To overcome this problem, we propose a learning-based automatic channel pruning algorithm for deep neural networks, inspired by recent automated machine learning (AutoML). A pruning problem with two objectives, the weights and the number of remaining channels for each layer, is first formulated. An alternating optimization approach is then proposed to derive the channel numbers and weights simultaneously. During pruning, we use a searchable hyper-parameter, the remaining ratio, to denote the number of channels retained in each convolution layer, and we propose a dynamic masking process to describe the corresponding channel evolution. To adjust the trade-off between model accuracy and the pruning ratio of floating-point operations (FLOPs), a new loss function is further introduced. Extensive experimental results on benchmark datasets demonstrate that our scheme achieves competitive results for neural network pruning.
Pages: 2121-2129
Number of pages: 9
Related papers
50 records in total
  • [41] Hyper-Parameter Tuning for the (1+(λ, λ)) GA
Dang, Nguyen
    Doerr, Carola
    PROCEEDINGS OF THE 2019 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE (GECCO'19), 2019, : 889 - 897
  • [42] Modified Grid Searches for Hyper-Parameter Optimization
    Lopez, David
    Alaiz, Carlos M.
    Dorronsoro, Jose R.
    HYBRID ARTIFICIAL INTELLIGENT SYSTEMS, HAIS 2020, 2020, 12344 : 221 - 232
  • [43] Hybrid Hyper-parameter Optimization for Collaborative Filtering
    Szabo, Peter
    Genge, Bela
    2020 22ND INTERNATIONAL SYMPOSIUM ON SYMBOLIC AND NUMERIC ALGORITHMS FOR SCIENTIFIC COMPUTING (SYNASC 2020), 2020, : 210 - 217
  • [44] Hyper-parameter Optimization Using Continuation Algorithms
    Rojas-Delgado, Jairo
    Jimenez, J. A.
    Bello, Rafael
    Lozano, J. A.
    METAHEURISTICS, MIC 2022, 2023, 13838 : 365 - 377
  • [45] Hyper-parameter Tuning under a Budget Constraint
    Lu, Zhiyun
    Chen, Liyu
    Chiang, Chao-Kai
    Sha, Fei
    PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 5744 - 5750
  • [46] Bayesian Hyper-Parameter Optimisation for Malware Detection
    ALGorain, Fahad T.
    Clark, John A.
    ELECTRONICS, 2022, 11 (10)
  • [47] A New Baseline for Automated Hyper-Parameter Optimization
    Geitle, Marius
    Olsson, Roland
    MACHINE LEARNING, OPTIMIZATION, AND DATA SCIENCE, 2019, 11943 : 521 - 530
  • [48] Hippo: Sharing Computations in Hyper-Parameter Optimization
    Shin, Ahnjae
    Jeong, Joo Seong
    Kim, Do Yoon
    Jung, Soyoung
    Chun, Byung-Gon
    PROCEEDINGS OF THE VLDB ENDOWMENT, 2022, 15 (05): : 1038 - 1052
  • [49] Classification complexity assessment for hyper-parameter optimization
    Cai, Ziyun
    Long, Yang
    Shao, Ling
    PATTERN RECOGNITION LETTERS, 2019, 125 : 396 - 403
  • [50] Hyper-Parameter Optimization of Stacked Asymmetric Auto-Encoders for Automatic Personality Traits Perception
    Zaferani, Effat Jalaeian
    Teshnehlab, Mohammad
    Khodadadian, Amirreza
    Heitzinger, Clemens
    Vali, Mansour
    Noii, Nima
    Wick, Thomas
    SENSORS, 2022, 22 (16)