A Smoothed LASSO-Based DNN Sparsification Technique

Cited by: 3
Authors
Koneru, Basava Naga Girish [1]
Chandrachoodan, Nitin [1]
Vasudevan, Vinita [1]
Affiliation
[1] IIT Madras, Dept Elect Engn, Chennai 600036, Tamil Nadu, India
Keywords
Smoothing methods; Neurons; Training; Approximation algorithms; Sensitivity; Convergence; Cost function; Deep neural networks; LASSO; smoothing functions; sparsity; structured LASSO; L-1/2 regularization; input layer; algorithms
DOI
10.1109/TCSI.2021.3097765
Chinese Library Classification (CLC): TM [Electrical technology]; TN [Electronic and communication technology]
Discipline classification codes: 0808; 0809
Abstract
Deep Neural Networks (DNNs) are increasingly being used in a variety of applications. However, DNNs have huge computational and memory requirements. One way to reduce these requirements is to sparsify DNNs by using smoothed LASSO (Least Absolute Shrinkage and Selection Operator) functions. In this paper, we show that irrespective of the error profile, the sparsity values obtained using various smoothed LASSO functions are similar, provided the maximum error of these functions with respect to the LASSO function is the same. We also propose a layer-wise DNN pruning algorithm in which each layer is pruned against its individually allocated accuracy-loss budget, determined from estimates of the reduction in the number of multiply-accumulate operations (in convolutional layers) and weights (in fully connected layers). Further, structured LASSO variants in both convolutional and fully connected layers are explored within the smoothed LASSO framework, and the tradeoffs involved are discussed. The efficacy of the proposed algorithm in enhancing sparsity within the allowed degradation in DNN accuracy, together with the results obtained for the structured LASSO variants, is demonstrated on the MNIST, SVHN, CIFAR-10, and Imagenette datasets and on larger networks such as ResNet-50 and Mobilenet.
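The central claim above, that different smoothed LASSO surrogates give similar sparsity once their maximum deviation from |w| is matched, can be illustrated with a small numerical sketch. The two surrogates below (a square-root smoothing and a log-cosh smoothing) and the function names are illustrative assumptions made for this record, not the specific functions or code from the paper; the sketch only shows how a common maximum-error budget delta fixes each smoothing parameter.

```python
import numpy as np

# Two representative smoothed surrogates of the LASSO penalty |w|.
# These particular surrogates are assumptions for illustration; the point
# is only that their smoothing parameters can be chosen so that both
# deviate from |w| by at most the same maximum error delta.

def smooth_abs_sqrt(w, delta):
    """sqrt(w^2 + eps^2): deviation from |w| peaks at w = 0 and equals eps."""
    eps = delta
    return np.sqrt(w ** 2 + eps ** 2)

def smooth_abs_logcosh(w, delta):
    """(1/beta) * log(2*cosh(beta*w)): deviation from |w| peaks at w = 0 and equals ln(2)/beta."""
    beta = np.log(2.0) / delta
    # numerically stable rewriting of (1/beta) * log(2*cosh(beta*w))
    return np.abs(w) + np.log1p(np.exp(-2.0 * beta * np.abs(w))) / beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.05, size=10_000)   # stand-in for one layer's weights
    delta = 1e-3                               # common maximum-error budget

    for name, f in [("sqrt", smooth_abs_sqrt), ("log-cosh", smooth_abs_logcosh)]:
        err = np.max(np.abs(f(w, delta) - np.abs(w)))
        print(f"{name:8s} max |f(w) - |w|| = {err:.2e} (<= {delta:.0e})")
```

In the same spirit, the layer-wise pruning described in the abstract allocates each layer a share of the overall accuracy-loss budget in proportion to its estimated reduction in multiply-accumulate operations (convolutional layers) or weight count (fully connected layers); the exact allocation rule and thresholds are given in the paper itself.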
Pages: 4287-4298
Number of pages: 12