DANNP: an efficient artificial neural network pruning tool

Cited by: 9
Authors
Alshahrani, Mona [1 ]
Soufan, Othman [1 ]
Magana-Mora, Arturo [1 ,2 ]
Bajic, Vladimir B. [1 ]
Affiliations
[1] KAUST, CBRC, Thuwal, Saudi Arabia
[2] Natl Inst Adv Ind Sci & Technol, CBBD OIL, Tokyo, Japan
Keywords
Artificial neural networks; Pruning; Parallelization; Feature selection; Classification problems; Machine learning; Artificial intelligence; FEATURE-EXTRACTION; FEATURE-SELECTION; INFORMATION; PREDICTION
DOI
10.7717/peerj-cs.137
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Background. Artificial neural networks (ANNs) are a robust class of machine learning models and a frequent choice for solving classification problems. However, determining the structure of an ANN is not trivial, as a large number of weights (connection links) may lead to overfitting the training data. Although several ANN pruning algorithms have been proposed to simplify ANNs, these algorithms cannot efficiently cope with the intricate ANN structures required for complex classification problems.

Methods. We developed DANNP, a web-based tool that implements parallelized versions of several ANN pruning algorithms. The DANNP tool uses a modified version of the Fast Compressed Neural Network software, implemented in C++, to considerably reduce the running time of the ANN pruning algorithms we implemented. In addition to evaluating the performance of the pruned ANNs, we systematically compared the set of features that remained in the pruned ANN with those obtained by different state-of-the-art feature selection (FS) methods.

Results. Although the ANN pruning algorithms are not entirely parallelizable, DANNP sped up ANN pruning by up to eight times on a 32-core machine compared to the serial implementations. To assess the impact of pruning by the DANNP tool, we used 16 datasets from different domains. In eight of the 16 datasets, DANNP significantly reduced the number of weights by 70%-99% while maintaining competitive or better model performance compared to the unpruned ANN. Finally, we trained a naive Bayes classifier on the features selected as a byproduct of the ANN pruning and showed that its accuracy is comparable to that obtained with classifiers trained on features selected by several state-of-the-art FS methods. The FS ranking methodology proposed in this study allows users to identify the most discriminative features of the problem at hand. To the best of our knowledge, DANNP (publicly available at www.cbrc.kaust.edu.sa/dannp) is the only available, online-accessible tool that provides multiple parallelized ANN pruning options. Datasets and DANNP code can be obtained at www.cbrc.kaust.edu.sa/dannp/data.php and https://doi.org/10.5281/zenodo.1001086.
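To make the pipeline described in the abstract concrete, the following is a minimal Python sketch of the general idea: train an ANN, prune low-magnitude input weights, treat the input features that keep at least one surviving connection as the selected features, and train a naive Bayes classifier on them. This is not DANNP itself, which is a parallelized C++/web tool implementing several pruning algorithms; the magnitude-based criterion, the scikit-learn MLP, the 90% pruning ratio, and the example dataset are all assumptions made here for illustration.

```python
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier

# Load a small benchmark dataset (an illustrative stand-in for the
# 16 datasets evaluated in the paper).
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Train a single-hidden-layer ANN.
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
ann.fit(X_tr, y_tr)

# Magnitude-based pruning (one common criterion; DANNP offers several):
# zero out the 90% of input-to-hidden weights with the smallest magnitude.
W = ann.coefs_[0].copy()                  # shape: (n_features, n_hidden)
threshold = np.quantile(np.abs(W), 0.90)  # assumed pruning ratio
W[np.abs(W) < threshold] = 0.0
ann.coefs_[0] = W                         # the pruned network

# Feature selection as a byproduct of pruning: a feature survives if at
# least one of its outgoing connections remains; the summed magnitude of
# surviving weights gives a simple feature ranking.
importance = np.abs(W).sum(axis=1)
selected = np.flatnonzero(importance > 0)
print(f"{selected.size} of {X.shape[1]} features survive pruning")

# Naive Bayes trained on the surviving features, mirroring the
# comparison against state-of-the-art FS methods in the abstract.
nb = GaussianNB().fit(X_tr[:, selected], y_tr)
print("pruned ANN accuracy :", round(ann.score(X_te, y_te), 3))
print("naive Bayes accuracy:", round(nb.score(X_te[:, selected], y_te), 3))
```

Sorting features by the `importance` vector above yields the kind of FS ranking the abstract refers to: features whose connections survive pruning with large magnitudes are the most discriminative for the classification task.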
Pages: 22
Related Papers (50 in total)
  • [31] An artificial neural network model as a preliminary track design tool
    Montalban Domingo, Laura
    Villaronte Fernandez-Villa, Juan Antonio
    Masanet Sendra, Claudio
    Real Herraiz, Julia I.
    PROCEEDINGS OF THE INSTITUTION OF MECHANICAL ENGINEERS PART F-JOURNAL OF RAIL AND RAPID TRANSIT, 2016, 230 (04) : 1105 - 1117
  • [32] Novel Screening Tool for Stroke Using Artificial Neural Network
    Abedi, Vida
    Goyal, Nitin
    Tsivgoulis, Georgios
    Hosseinichimeh, Niyousha
    Hontecillas, Raquel
    Bassaganya-Riera, Josep
    Elijovich, Lucas
    Metter, Jeffrey E.
    Alexandrov, Anne W.
    Liebeskind, David S.
    Alexandrov, Andrei V.
    Zand, Ramin
    STROKE, 2017, 48 (06) : 1678 - +
  • [33] Seismic Image Recognition Tool via Artificial Neural Network
    Yong, Suet-Peng
    Chen, Yoke Yie
    Wan, Chin Ee
    14TH IEEE INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND INFORMATICS (CINTI), 2013, : 399 - 404
  • [34] ScoringNet: A Neural Network Based Pruning Criteria for Structured Pruning
    Wang S.
    Zhang Z.
    Scientific Programming, 2023, 2023
  • [35] Recall Distortion in Neural Network Pruning and the Undecayed Pruning Algorithm
    Good, Aidan
    Lin, Jiaqi
    Yu, Xin
    Sieg, Hannah
    Ferguson, Mikey
    Zhe, Shandian
    Wieczorek, Jerzy
    Serra, Thiago
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [36] Pruning by explaining: A novel criterion for deep neural network pruning
    Yeom, Seul-Ki
    Seegerer, Philipp
    Lapuschkin, Sebastian
    Binder, Alexander
    Wiedemann, Simon
    Mueller, Klaus-Robert
    Samek, Wojciech
    PATTERN RECOGNITION, 2021, 115
  • [37] PRUNING ARTIFICIAL NEURAL NETWORKS USING NEURAL COMPLEXITY MEASURES
    Jorgensen, Thomas D.
    Haynes, Barry P.
    Norlund, Charlotte C. F.
    INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2008, 18 (05) : 389 - 403
  • [38] Variational Convolutional Neural Network Pruning
    Zhao, Chenglong
    Ni, Bingbing
    Zhang, Jian
    Zhao, Qiwei
    Zhang, Wenjun
    Tian, Qi
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 2775 - 2784
  • [39] Dirichlet Pruning for Neural Network Compression
    Adamczewski, Kamil
    Park, Mijung
    24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130
  • [40] Convolutional Neural Network Pruning: A Survey
    Xu, Sheng
    Huang, Anran
    Chen, Lei
    Zhang, Baochang
    PROCEEDINGS OF THE 39TH CHINESE CONTROL CONFERENCE, 2020, : 7458 - 7463