Feature flow regularization: Improving structured sparsity in deep neural networks

Cited by: 6
Authors
Wu, Yue [1 ]
Lan, Yuan [1 ]
Zhang, Luchan [2 ]
Xiang, Yang [1 ,3 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Math, Kowloon, Clear Water Bay, Hong Kong, Peoples R China
[2] Shenzhen Univ, Coll Math & Stat, Shenzhen 518060, Peoples R China
[3] HKUST Shenzhen Hong Kong Collaborat Innovat Res In, Algorithms Machine Learning & Autonomous Driving R, Shenzhen, Peoples R China
Keywords
Deep neural networks; Structured pruning; Image classification; Regularization;
DOI
10.1016/j.neunet.2023.02.013
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Pruning is a model compression method that removes redundant parameters and accelerates the inference speed of deep neural networks (DNNs) while maintaining accuracy. Most available pruning methods impose various conditions on parameters or features directly. In this paper, we propose a simple and effective regularization strategy that improves structured sparsity and structured pruning in DNNs from a new perspective: the evolution of features. In particular, we consider the trajectories connecting features of adjacent hidden layers, namely the feature flow. We propose feature flow regularization (FFR), which penalizes the length and the total absolute curvature of these trajectories and thereby implicitly increases the structured sparsity of the parameters. The principle behind FFR is that short and straight trajectories lead to an efficient network that avoids redundant parameters. Experiments on the CIFAR-10 and ImageNet datasets show that FFR improves structured sparsity and achieves pruning results comparable to, or even better than, those of state-of-the-art methods. (c) 2023 Elsevier Ltd. All rights reserved.
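The penalty described in the abstract can be illustrated with a short sketch. This is a hedged, minimal reading of the idea, not the authors' implementation: `ffr_penalty`, the weights `lam_len`/`lam_curv`, and the assumption that features of adjacent layers share the same dimension are all illustrative choices. The length term sums the distances between features of adjacent layers; the curvature term sums the norms of second finite differences along the trajectory, so a short, straight feature flow incurs a small penalty.

```python
def ffr_penalty(features, lam_len=1.0, lam_curv=1.0):
    """Sketch of a feature-flow-style penalty.

    features: list of feature vectors (equal length), one per hidden layer,
              forming a trajectory h_1, ..., h_L through feature space.
    Returns lam_len * (trajectory length) + lam_curv * (total curvature),
    where curvature is measured by second finite differences.
    """
    def dist(a, b):
        # Euclidean distance between two feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    # Length term: sum of ||h_{l+1} - h_l|| over adjacent layers.
    length = sum(dist(features[l + 1], features[l])
                 for l in range(len(features) - 1))

    # Curvature term: sum of ||h_{l+1} - 2 h_l + h_{l-1}|| (second differences).
    curvature = sum(
        dist(features[l + 1],
             [2 * m - p for m, p in zip(features[l], features[l - 1])])
        for l in range(1, len(features) - 1)
    )
    return lam_len * length + lam_curv * curvature
```

For a perfectly straight, evenly spaced trajectory the curvature term vanishes and only the length term remains, which matches the intuition that "short and straight" feature flows are favored.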
Pages: 598-613
Page count: 16