Performance analysis of efficient pre-trained networks based on transfer learning for tomato leaf diseases classification

Cited by: 0
Authors
Gharghory S.M. [1]
Affiliations
[1] Computers and Systems Department, Electronics Research Institute, Giza
Keywords
AlexNet; Deep learning; SqueezeNet; Tomato leaf disease diagnosis and classification; VGG-16 networks
DOI
10.14569/IJACSA.2020.0110830
Abstract
Early diagnosis and accurate identification of tomato leaf diseases help control the spread of infection and keep plants healthy, which in turn increases the crop harvest. Nine common tomato leaf diseases strongly affect the quality and quantity of tomato crop yield. Traditional approaches to feature extraction and image classification cannot ensure a high accuracy rate for leaf disease identification. This paper proposes an automatic detection approach for tomato leaf diseases based on fine-tuning and transfer learning of pre-trained deep Convolutional Neural Networks. Three pre-trained deep networks adapted by transfer learning, AlexNet, VGG-16 Net, and SqueezeNet, are analyzed for their performance in tomato leaf disease classification. The proposed networks are evaluated on two datasets: a small dataset covering only four diseases, and a large dataset of leaves showing symptoms of nine diseases together with healthy leaves. Performance is measured in terms of classification accuracy and training time. On the small dataset, the proposed networks are also compared with a state-of-the-art technique from the literature; their classification accuracies exceed that technique's by 8.1% and 15%. On the large dataset, the proposed pre-trained AlexNet achieves a high classification accuracy of 97.4% with a shorter training time than the other pre-trained networks. Overall, AlexNet shows the best performance for diagnosing tomato leaf diseases in terms of both accuracy and execution time. In contrast, VGG-16 Net attains the best classification accuracy but requires the longest training time among the networks. © 2020, Science and Information Organization.
Pages: 230-240
Page count: 10