Structured pruning of recurrent neural networks through neuron selection

Times Cited: 26
Authors
Wen, Liangjian [1 ]
Zhang, Xuanyang [1 ]
Bai, Haoli [2 ]
Xu, Zenglin [1 ,3 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, SMILE Lab, Chengdu 610031, Peoples R China
[2] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Shatin, Hong Kong 999077, Peoples R China
[3] Ctr Artificial Intelligence, Peng Cheng Lab, Shenzhen, Guangdong, Peoples R China
Keywords
Feature selection; Recurrent neural networks; Learning sparse models; Model compression
DOI
10.1016/j.neunet.2019.11.018
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Recurrent neural networks (RNNs) have recently achieved remarkable successes in a number of applications. However, the huge size and computational burden of these models make it difficult to deploy them on edge devices. A practically effective approach is to reduce the overall storage and computation costs of RNNs through network pruning. Despite their successful applications, pruning methods based on Lasso produce irregular sparse patterns in the weight matrices, which is not helpful for practical speedup. To address this issue, we propose a structured pruning method based on neuron selection, which removes independent neurons of RNNs. More specifically, we introduce two sets of binary random variables, which can be interpreted as gates or switches on the input neurons and the hidden neurons, respectively. We demonstrate that the corresponding optimization problem can be addressed by minimizing the L0 norm of the weight matrices. Finally, experimental results on language modeling and machine reading comprehension tasks demonstrate the advantages of the proposed method over state-of-the-art pruning competitors. In particular, a nearly 20x practical inference speedup is achieved without loss of performance for the language model on the Penn Treebank dataset, indicating the promise of the proposed method. (C) 2019 Elsevier Ltd. All rights reserved.
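To make the abstract's gating idea concrete, the following is a minimal, illustrative PyTorch sketch, not the authors' released code. It places one trainable binary gate on each input and hidden neuron of an RNN cell and penalizes the expected L0 norm of the gates; to make the discrete gates differentiable it assumes the hard concrete relaxation of Louizos et al. (2018), which the abstract does not specify. All names here (L0Gate, GatedRNNCell, BETA/GAMMA/ZETA) are illustrative choices, not identifiers from the paper.

```python
# Minimal, illustrative sketch (NOT the authors' code): neuron-level binary
# gates on the input and hidden units of an RNN cell, trained with an
# expected-L0 penalty via the hard concrete relaxation (Louizos et al., 2018).
import math
import torch
import torch.nn as nn

# Temperature/stretch constants from the hard concrete distribution.
BETA, GAMMA, ZETA = 2.0 / 3.0, -0.1, 1.1

class L0Gate(nn.Module):
    """One trainable stochastic gate per neuron; exact zeros are reachable."""

    def __init__(self, n_units: int):
        super().__init__()
        self.log_alpha = nn.Parameter(torch.zeros(n_units))  # gate logits

    def forward(self) -> torch.Tensor:
        if self.training:
            # Reparameterized sample from the concrete distribution.
            u = torch.rand_like(self.log_alpha).clamp(1e-6, 1 - 1e-6)
            s = torch.sigmoid((u.log() - (1 - u).log() + self.log_alpha) / BETA)
        else:
            s = torch.sigmoid(self.log_alpha)
        # Stretch to (GAMMA, ZETA), then hard-clip to [0, 1]: gates can hit 0/1.
        return (s * (ZETA - GAMMA) + GAMMA).clamp(0.0, 1.0)

    def expected_l0(self) -> torch.Tensor:
        # Differentiable surrogate for the number of non-zero gates.
        return torch.sigmoid(self.log_alpha - BETA * math.log(-GAMMA / ZETA)).sum()

class GatedRNNCell(nn.Module):
    """Vanilla RNN cell whose input and hidden neurons pass through gates."""

    def __init__(self, n_in: int, n_hid: int):
        super().__init__()
        self.cell = nn.RNNCell(n_in, n_hid)
        self.in_gate = L0Gate(n_in)
        self.hid_gate = L0Gate(n_hid)

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # Gating whole neurons keeps the sparsity structured: a zero gate
        # zeroes an entire row/column of the weight matrices at once.
        return self.cell(x * self.in_gate(), h * self.hid_gate())

    def expected_l0(self) -> torch.Tensor:
        return self.in_gate.expected_l0() + self.hid_gate.expected_l0()

if __name__ == "__main__":
    cell = GatedRNNCell(n_in=16, n_hid=32)
    x, h = torch.randn(4, 16), torch.zeros(4, 32)
    # Total loss = task loss + lambda * expected L0 of the gates.
    loss = cell(x, h).pow(2).mean() + 1e-3 * cell.expected_l0()
    loss.backward()
```

After training, neurons whose gates evaluate to zero can be physically removed along with their weight rows and columns, leaving smaller dense matrices; this is what distinguishes structured pruning, and its practical speedup, from the irregular sparsity produced by Lasso-style methods.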
Pages: 134-141
Number of pages: 8