Identification Method of Citrus Aurantium Diseases and Pests Based on Deep Convolutional Neural Network

Cited: 4
Authors
Lin, Yuke [1 ]
Xu, Jin [2 ]
Zhang, Ying [3 ]
Affiliations
[1] Chongqing College of Electronic Engineering, Department of Communication Engineering, Chongqing 401331, People's Republic of China
[2] Chongqing Academy of Chinese Materia Medica, Dabashan Branch, Chongqing 400065, People's Republic of China
[3] Chongqing Center for Disease Control and Prevention, Disinfection and Vector Control Institute, Chongqing 400042, People's Republic of China
Keywords
461.4 Ergonomics and Human Factors Engineering - 716.1 Information Theory and Signal Processing;
DOI
10.1155/2022/7012399
Chinese Library Classification (CLC)
Q [Biological Sciences];
Discipline Codes
07; 0710; 09;
Abstract
Traditional identification methods for Citrus aurantium diseases and pests are prone to premature convergence during training, resulting in low identification accuracy. To this end, this study proposes an identification method for Citrus aurantium diseases and pests based on a deep convolutional neural network (DCNN). Initial images of Citrus aurantium leaves are collected by hardware equipment and then preprocessed using cropping, enhancement, and morphological transformation. A neural network is used to segment the disease spots in the Citrus aurantium images, and accurate recognition results are obtained through feature matching. Comparative experiments show that the recognition rate of the proposed method is about 11.9% higher than that of the traditional recognition method, indicating better performance. The proposed method can overcome interference from the external environment to a certain extent and can provide reference data for the prevention and control of Citrus aurantium diseases and pests.
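The preprocessing pipeline the abstract describes (cropping, enhancement, and morphological transformation) can be sketched with NumPy alone. The function names, parameters, and the center-crop / min-max-stretch / square-element choices below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def crop_center(img, size):
    """Crop a square region from the image centre (a stand-in for the
    leaf-region cropping step described in the abstract)."""
    h, w = img.shape[:2]
    top, left = (h - size) // 2, (w - size) // 2
    return img[top:top + size, left:left + size]

def contrast_stretch(img):
    """Min-max contrast enhancement, rescaling intensities to [0, 255]."""
    lo, hi = float(img.min()), float(img.max())
    if hi == lo:
        return np.zeros(img.shape, dtype=np.uint8)
    return ((img - lo) * 255.0 / (hi - lo)).astype(np.uint8)

def dilate(mask, k=3):
    """Binary morphological dilation with a k x k square structuring
    element, implemented as a sliding-window maximum."""
    pad = k // 2
    padded = np.pad(mask, pad, mode="constant")
    out = np.zeros_like(mask)
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            out[i, j] = padded[i:i + k, j:j + k].max()
    return out
```

In a full pipeline of this kind, the stretched crop would be fed to the CNN, while dilation (paired with erosion) would clean up the binary disease-spot mask produced by segmentation.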
Pages: 8
Related Papers
50 records
  • [31] A deep convolutional neural network for OCT-based detection of corneal diseases
    Li, Yan
    Pavlatos, Elias
    Chamberlain, Winston
    Huang, David
    INVESTIGATIVE OPHTHALMOLOGY & VISUAL SCIENCE, 2023, 64 (08)
  • [32] Recognition Method for Handwritten Steel Billet Identification Number Based on Yolo Deep Convolutional Neural Network
    Sun, Qiaojie
    Chen, Dali
    Wang, Sen
    Liu, Shixin
    PROCEEDINGS OF THE 32ND 2020 CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2020), 2020, : 5642 - 5646
  • [33] Tomato Diseases and Pests Detection Based on Improved Yolo V3 Convolutional Neural Network
    Liu, Jun
    Wang, Xuewei
    FRONTIERS IN PLANT SCIENCE, 2020, 11
  • [34] Distracted driving recognition method based on deep convolutional neural network
    Xuli Rao
    Feng Lin
    Zhide Chen
    Jiaxu Zhao
    Journal of Ambient Intelligence and Humanized Computing, 2021, 12 : 193 - 200
  • [35] Cooperative Spectrum Sensing Method Based on Deep Convolutional Neural Network
    Gai Jianxin
    Xue Xianfeng
    Wu Jingyi
    Nan Ruixiang
    JOURNAL OF ELECTRONICS & INFORMATION TECHNOLOGY, 2021, 43 (10) : 2911 - 2919
  • [36] Cloud Image Classification Method Based on Deep Convolutional Neural Network
    Zhang F.
    Yan J.
    Xibei Gongye Daxue Xuebao/Journal of Northwestern Polytechnical University, 2020, 38 (04): : 740 - 746
  • [37] An inverse halftoning method based on supervised deep convolutional neural network
    Li, Mei
    Liu, Qi
    IET IMAGE PROCESSING, 2024, 18 (04) : 961 - 971
  • [38] A Reconstruction Method Based on Deep Convolutional Neural Network for SPECT Imaging
    Chrysostomou, Charalambos
    Koutsantonis, Loizos
    Lemesios, Christos
    Papanicolas, Costas N.
    2018 IEEE NUCLEAR SCIENCE SYMPOSIUM AND MEDICAL IMAGING CONFERENCE PROCEEDINGS (NSS/MIC), 2018,
  • [39] A face sequence recognition method based on deep convolutional neural network
    Ma, Siwei
    Cao, Meng
    Li, Jiadong
    Zhu, Quanyin
    Li, Xiang
    Shen, Yi
    Wang, Mengdi
    2019 18TH INTERNATIONAL SYMPOSIUM ON DISTRIBUTED COMPUTING AND APPLICATIONS FOR BUSINESS ENGINEERING AND SCIENCE (DCABES 2019), 2019, : 100 - 103
  • [40] Bone Age Assessment Method based on Deep Convolutional Neural Network
    Bian, Zengya
    Zhang, Runtong
    2018 8TH INTERNATIONAL CONFERENCE ON ELECTRONICS INFORMATION AND EMERGENCY COMMUNICATION (ICEIEC), 2018, : 194 - 197