DANNP: an efficient artificial neural network pruning tool

Cited by: 9
Authors
Alshahrani, Mona [1 ]
Soufan, Othman [1 ]
Magana-Mora, Arturo [1 ,2 ]
Bajic, Vladimir B. [1 ]
Affiliations
[1] KAUST, CBRC, Thuwal, Saudi Arabia
[2] Natl Inst Adv Ind Sci & Technol, CBBD OIL, Tokyo, Japan
Source
PeerJ Computer Science
Keywords
Artificial neural networks; Pruning; Parallelization; Feature selection; Classification problems; Machine learning; Artificial intelligence; FEATURE-EXTRACTION; FEATURE-SELECTION; INFORMATION; PREDICTION
DOI
10.7717/peerj-cs.137
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Background. Artificial neural networks (ANNs) are a robust class of machine learning models and a frequent choice for solving classification problems. However, determining the structure of an ANN is not trivial, as a large number of weights (connection links) may lead to overfitting the training data. Although several ANN pruning algorithms have been proposed for the simplification of ANNs, these algorithms cannot efficiently cope with the intricate ANN structures required for complex classification problems.
Methods. We developed DANNP, a web-based tool that implements parallelized versions of several ANN pruning algorithms. The DANNP tool uses a modified version of the Fast Compressed Neural Network software implemented in C++ to considerably reduce the running time of the ANN pruning algorithms we implemented. In addition to evaluating the performance of the pruned ANNs, we systematically compared the set of features that remained in the pruned ANN with those obtained by different state-of-the-art feature selection (FS) methods.
Results. Although the ANN pruning algorithms are not entirely parallelizable, DANNP was able to speed up ANN pruning by up to eight times on a 32-core machine, compared to the serial implementations. To assess the impact of pruning with the DANNP tool, we used 16 datasets from different domains. In eight of the 16 datasets, DANNP significantly reduced the number of weights by 70%-99% while maintaining competitive or better model performance compared to the unpruned ANN. Finally, we trained a naive Bayes classifier on the features selected as a byproduct of the ANN pruning and demonstrated that its accuracy is comparable to the accuracies obtained by classifiers trained with the features selected by several state-of-the-art FS methods. The FS ranking methodology proposed in this study allows users to identify the most discriminant features of the problem at hand. To the best of our knowledge, DANNP (publicly available at www.cbrc.kaust.edu.sa/dannp) is the only online-accessible tool that provides multiple parallelized ANN pruning options. Datasets and the DANNP code can be obtained at www.cbrc.kaust.edu.sa/dannp/data.php and https://doi.org/10.5281/zenodo.1001086.
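To make the approach summarized above concrete, the sketch below shows magnitude-based weight pruning of a single-hidden-layer network, together with a feature ranking derived from the input weights that survive pruning. The abstract does not detail which pruning algorithms DANNP implements (the tool wraps several, built on the Fast Compressed Neural Network C++ library), so the magnitude criterion, the layer shapes, and all function names here are illustrative assumptions rather than the DANNP implementation.

```python
# Minimal sketch of magnitude-based weight pruning on a single-hidden-layer
# ANN, plus a feature ranking read off from the surviving input weights.
# NOT the DANNP implementation; an illustration of the general idea only.
import numpy as np

rng = np.random.default_rng(0)
n_features, n_hidden, n_outputs = 20, 10, 2

# Stand-in trained weights; in practice these come from a fitted network.
W1 = rng.normal(size=(n_hidden, n_features))   # input -> hidden
W2 = rng.normal(size=(n_outputs, n_hidden))    # hidden -> output

def magnitude_prune(W, fraction):
    """Zero out the given fraction of weights with the smallest magnitude."""
    threshold = np.quantile(np.abs(W), fraction)
    return np.where(np.abs(W) >= threshold, W, 0.0)

# Prune 90% of the weights per layer (the abstract reports 70%-99%
# weight reductions on several of the 16 datasets).
W1_pruned = magnitude_prune(W1, 0.9)
W2_pruned = magnitude_prune(W2, 0.9)

# Feature selection as a byproduct of pruning: score each input feature
# by the total magnitude of its surviving outgoing connections. Features
# scoring 0 were disconnected entirely and can be dropped before training
# a downstream classifier (e.g., naive Bayes, as done in the paper).
feature_scores = np.abs(W1_pruned).sum(axis=0)
ranking = np.argsort(feature_scores)[::-1]

kept = int((W1_pruned != 0).sum() + (W2_pruned != 0).sum())
total = W1.size + W2.size
print(f"weights kept: {kept}/{total}")
print("top 5 features by surviving weight mass:", ranking[:5])
```

Ranking features by the weight mass that survives pruning is one simple way to obtain the kind of FS ranking the abstract describes; the paper's actual ranking methodology may differ.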
Pages: 22
Related Papers
(50 items in total)
  • [41] Importance Estimation for Neural Network Pruning
    Molchanov, Pavlo
    Mallya, Arun
    Tyree, Stephen
    Frosio, Iuri
    Kautz, Jan
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 11256 - 11264
  • [42] Neural network pruning for function approximation
    Setiono, R
    Gaweda, A
    IJCNN 2000: PROCEEDINGS OF THE IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOL VI, 2000, : 443 - 448
  • [43] Neural network pruning and hardware acceleration
    Jeong, Taehee
    Ghasemi, Ehsan
    Tuyls, Jorn
    Delaye, Elliott
    Sirasao, Ashish
    2020 IEEE/ACM 13TH INTERNATIONAL CONFERENCE ON UTILITY AND CLOUD COMPUTING (UCC 2020), 2020, : 440 - 445
  • [44] Information geometry on pruning of neural network
    Liu, YH
    Luo, SW
    Li, AJ
    Yu, HB
    PROCEEDINGS OF THE 2004 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7, 2004, : 3479 - 3483
  • [45] A Probabilistic Approach to Neural Network Pruning
    Qian, Xin
    Klabjan, Diego
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [46] Adversarial Structured Neural Network Pruning
    Cai, Xingyu
    Yi, Jinfeng
    Zhang, Fan
    Rajasekaran, Sanguthevar
    PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT (CIKM '19), 2019, : 2433 - 2436
  • [47] Measurement criteria for neural network pruning
    Erdogan, SS
    Ng, GS
    Patrick, KHC
    1996 IEEE TENCON - DIGITAL SIGNAL PROCESSING APPLICATIONS PROCEEDINGS, VOLS 1 AND 2, 1996, : 83 - 89
  • [48] An Efficient Approach to Iterative Network Pruning
    Huang, Chuan-Shun
    Tang, Wuqian
    Chen, Yung-Chih
    Li, Yi-Ting
    Chang, Shih-Chieh
    Wang, Chun-Yao
    2024 INTERNATIONAL VLSI SYMPOSIUM ON TECHNOLOGY, SYSTEMS AND APPLICATIONS, VLSI TSA, 2024,
  • [49] EFFICIENT IMAGE SUPER RESOLUTION VIA CHANNEL DISCRIMINATIVE DEEP NEURAL NETWORK PRUNING
    Hou, Zejiang
    Kung, Sun-Yuan
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 3647 - 3651
  • [50] FLOPs-efficient filter pruning via transfer scale for neural network acceleration
    Guo, Zhixin
    Xiao, Yifan
    Liao, Wenzhi
    Veelaert, Peter
    Philips, Wilfried
    JOURNAL OF COMPUTATIONAL SCIENCE, 2021, 55