DANNP: an efficient artificial neural network pruning tool

Cited by: 9
Authors:
Alshahrani, Mona [1]
Soufan, Othman [1]
Magana-Mora, Arturo [1,2]
Bajic, Vladimir B. [1]
Affiliations:
[1] KAUST, CBRC, Thuwal, Saudi Arabia
[2] Natl Inst Adv Ind Sci & Technol, CBBD OIL, Tokyo, Japan
Source: PeerJ Computer Science
Keywords:
Artificial neural networks; Pruning; Parallelization; Feature selection; Classification problems; Machine learning; Artificial intelligence; FEATURE-EXTRACTION; FEATURE-SELECTION; INFORMATION; PREDICTION
DOI:
10.7717/peerj-cs.137
Chinese Library Classification:
TP18 [Artificial Intelligence Theory]
Subject classification codes:
081104; 0812; 0835; 1405
Abstract:
Background. Artificial neural networks (ANNs) are a robust class of machine learning models and a frequent choice for solving classification problems. However, determining the structure of an ANN is not trivial, as a large number of weights (connection links) may lead to overfitting the training data. Although several ANN pruning algorithms have been proposed to simplify ANNs, these algorithms cannot efficiently cope with the intricate ANN structures required for complex classification problems.

Methods. We developed DANNP, a web-based tool that implements parallelized versions of several ANN pruning algorithms. DANNP uses a modified version of the Fast Compressed Neural Network software, implemented in C++, to considerably reduce the running time of the implemented pruning algorithms. In addition to evaluating the performance of the pruned ANNs, we systematically compared the set of features that remained in the pruned ANN with those obtained by different state-of-the-art feature selection (FS) methods.

Results. Although the ANN pruning algorithms are not entirely parallelizable, DANNP sped up ANN pruning by up to eight times on a 32-core machine compared to the serial implementations. To assess the impact of pruning by the DANNP tool, we used 16 datasets from different domains. In eight of the 16 datasets, DANNP significantly reduced the number of weights by 70%-99% while maintaining model performance competitive with, or better than, that of the unpruned ANN. Finally, we trained a naive Bayes classifier on the features selected as a byproduct of the ANN pruning and demonstrated that its accuracy is comparable to that of classifiers trained with features selected by several state-of-the-art FS methods. The FS ranking methodology proposed in this study allows users to identify the most discriminant features of the problem at hand. To the best of our knowledge, DANNP (publicly available at www.cbrc.kaust.edu.sa/dannp) is the only online-accessible tool that provides multiple parallelized ANN pruning options. Datasets and DANNP code can be obtained at www.cbrc.kaust.edu.sa/dannp/data.php and https://doi.org/10.5281/zenodo.1001086.
Pages: 22
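The abstract's two core ideas, removing connection weights and reading off a feature ranking from the surviving input connections, can be illustrated with a minimal sketch. The Python snippet below is not DANNP's implementation (DANNP is a C++ tool offering several parallelized pruning algorithms); the toy network, the magnitude-based threshold rule, and the weight-sum ranking heuristic are all illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Toy single-hidden-layer network: 10 inputs -> 8 hidden -> 2 outputs.
W1 = rng.normal(size=(10, 8))   # input-to-hidden weights
W2 = rng.normal(size=(8, 2))    # hidden-to-output weights

def prune_by_magnitude(W, fraction):
    # Zero out the smallest-magnitude fraction of weights. This is one
    # simple pruning heuristic, standing in for DANNP's algorithms.
    threshold = np.quantile(np.abs(W), fraction)
    return np.where(np.abs(W) < threshold, 0.0, W)

W1p = prune_by_magnitude(W1, 0.9)   # e.g., remove 90% of the weights
W2p = prune_by_magnitude(W2, 0.9)

# Feature selection as a byproduct: an input feature whose outgoing
# connections were all pruned no longer influences the output.
surviving = np.abs(W1p).sum(axis=1)        # remaining weight mass per input
selected = np.flatnonzero(surviving > 0)   # features still connected
ranking = np.argsort(-surviving)           # rank features by remaining weight
print("selected features:", selected)
print("feature ranking:", ranking)

On this toy example, inputs whose outgoing weights are all zeroed drop out of the model entirely, which is the sense in which pruning yields feature selection, and a feature ranking, as a byproduct.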
Related papers (50 records total)
  • [21] Power-efficient neural network with artificial dendrites. Li, Xinyi; Tang, Jianshi; Zhang, Qingtian; Gao, Bin; Yang, J. Joshua; Song, Sen; Wu, Wei; Zhang, Wenqiang; Yao, Peng; Deng, Ning; Deng, Lei; Xie, Yuan; Qian, He; Wu, Huaqiang. NATURE NANOTECHNOLOGY, 2020, 15 (09): 776-782.
  • [23] Efficient Selection of Inputs for Artificial Neural Network Models. Fernando, T. M. K. G.; Maier, H. R.; Dandy, G. C.; May, R. MODSIM 2005: INTERNATIONAL CONGRESS ON MODELLING AND SIMULATION: ADVANCES AND APPLICATIONS FOR MANAGEMENT AND DECISION MAKING, 2005: 1806-1812.
  • [24] An efficient pruning and fine-tuning method for deep spiking neural network. Meng, L. W.; Qiao, G. C.; Zhang, X. Y.; Bai, J.; Zuo, Y.; Zhou, P. J.; Liu, Y.; Hu, S. G. APPLIED INTELLIGENCE, 2023, 53 (23): 28910-28923.
  • [25] Zero-Keep Filter Pruning for Energy Efficient Deep Neural Network. Woo, Yunhee; Kim, Dongyoung; Jeong, Jaemin; Ko, Young-Woong; Lee, Jeong-Gun. 11TH INTERNATIONAL CONFERENCE ON ICT CONVERGENCE: DATA, NETWORK, AND AI IN THE AGE OF UNTACT (ICTC 2020), 2020: 1288-1292.
  • [27] Canopy pruning grade classification based on Fast Fourier transform and artificial neural network. Shao, Y.; Tan, L.; Zeng, B.; Zhang, Q. TRANSACTIONS OF THE ASABE, 2014, 57 (03): 963-971.
  • [28] Artificial neural network technologies as a tool to histological preparation analysis. Nikitina, M. A.; Chernukha, I. M.; Pchelkina, V. A. 60TH INTERNATIONAL MEAT INDUSTRY CONFERENCE MEATCON2019, 2019, 333.
  • [29] Artificial Neural Network Assisted Analog IC Sizing Tool. Islamoglu, Gamze; Cakici, Tugberk Ogulcan; Afacan, Engin; Dundar, Günhan. 2019 16TH INTERNATIONAL CONFERENCE ON SYNTHESIS, MODELING, ANALYSIS AND SIMULATION METHODS AND APPLICATIONS TO CIRCUIT DESIGN (SMACD 2019), 2019: 9-12.
  • [30] Using artificial neural network as a tool for epidemiological data analysis. Polak, S.; Mendyk, A.; Brandys, J. NEURAL NETWORKS AND SOFT COMPUTING, 2003: 486-491.