Knowledge Distillation Facilitates the Lightweight and Efficient Plant Diseases Detection Model

Cited by: 15
Authors
Huang, Qianding [1 ]
Wu, Xingcai [1 ]
Wang, Qi [1 ,2 ]
Dong, Xinyu [1 ]
Qin, Yongbin [2 ]
Wu, Xue [1 ,3 ]
Gao, Yangyang [3 ]
Hao, Gefei [1 ,3 ]
Affiliations
[1] Guizhou Univ, Coll Comp Sci & Technol, State Key Lab Publ Big Data, Guiyang 550025, Peoples R China
[2] Guizhou Univ, Text Comp & Cognit Intelligence Engn Res Ctr, Natl Educ Minist, Guiyang 550025, Peoples R China
[3] Guizhou Univ, Natl Key Lab Green Pesticide, Guiyang 550025, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Detection models; Disease detection; Efficient plants; Food production; Large-scale; Model parameters; Multi-stage; Object detection; Plant disease; Plant disease diagnosis
DOI
10.34133/plantphenomics.0062
CLC Number
S3 [Agronomy]
Discipline Code
0901
Abstract
Timely plant disease diagnosis can inhibit the spread of disease and prevent large-scale drops in production, which benefits food production. Object detection-based plant disease diagnosis methods have attracted widespread attention for their accuracy in classifying and locating diseases. However, existing methods are still limited to diagnosing diseases of a single crop. More importantly, existing models carry large numbers of parameters, which hinders their deployment on agricultural mobile devices, yet reducing the parameter count tends to reduce accuracy. To solve these problems, we propose a plant disease detection method based on knowledge distillation that achieves lightweight and efficient diagnosis of multiple diseases across multiple crops. In detail, we design 2 strategies to build 4 lightweight student models, YOLOR-Light-v1, YOLOR-Light-v2, Mobile-YOLOR-v1, and Mobile-YOLOR-v2, and adopt the YOLOR model as the teacher. We develop a multistage knowledge distillation method to improve lightweight model performance, achieving 60.4% mAP@0.5 on the PlantDoc dataset with a small parameter count, outperforming existing methods. Overall, multistage knowledge distillation makes the model lighter while maintaining high accuracy. Moreover, the technique can be extended to other tasks, such as image classification and image segmentation, to obtain automated plant disease diagnosis models with wider lightweight applicability in smart agriculture. Our code is available at https://github.com/QDH/MSKD.
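The authors' multistage distillation code is in the repository linked above. As background for readers new to the technique, the sketch below shows the standard soft-target knowledge distillation loss (Hinton et al., 2015) in PyTorch, the generic teacher-student objective that methods of this kind build on. It is a minimal illustration, not the paper's multistage method; the temperature T and blending weight alpha are illustrative defaults, not values from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Soft-target distillation loss: KL(teacher || student) at temperature T,
    blended with ordinary hard-label cross-entropy."""
    # Temperature-softened distributions; the T^2 factor keeps the
    # soft-target gradients on the same scale as the hard-label term.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: cross-entropy against the ground-truth class labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Minimal usage: a frozen teacher's logits guide a smaller student on one batch.
if __name__ == "__main__":
    student_logits = torch.randn(8, 10, requires_grad=True)  # student outputs
    teacher_logits = torch.randn(8, 10)                      # teacher outputs (no grad)
    labels = torch.randint(0, 10, (8,))                      # ground-truth classes
    loss = distillation_loss(student_logits, teacher_logits, labels)
    loss.backward()
```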
Pages: 20
Related Papers
50 in total
  • [41] Structured Attention Knowledge Distillation for Lightweight Networks
    Gu, Xiaowei
    Hui, Tian
    Dai, Zhongjian
    PROCEEDINGS OF THE 33RD CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2021), 2021, : 1726 - 1730
  • [42] RTR_Lite_MobileNetV2: A lightweight and efficient model for plant disease detection and classification
    Duhan, Sangeeta
    Gulia, Preeti
    Gill, Nasib Singh
    Narwal, Ekta
    CURRENT PLANT BIOLOGY, 2025, 42
  • [43] Lightweight Spectrum Prediction Based on Knowledge Distillation
    Cheng, Runmeng
    Zhang, Jianzhao
    Deng, Junquan
    Zhu, Yanping
    RADIOENGINEERING, 2023, 32 (04) : 469 - 478
  • [44] Dual model knowledge distillation for industrial anomaly detection
    Thomine, Simon
    Snoussi, Hichem
    PATTERN ANALYSIS AND APPLICATIONS, 2024, 27 (03)
  • [45] Feature-based knowledge distillation for explainable detection of pulmonary diseases
    Piperno, Ruben
    Bacco, Luca
    Petrosino, Lorenzo
    Matarrese, Margherita A. G.
    Merone, Mario
    Pecchia, Leandro
    HEALTH AND TECHNOLOGY, 2025, : 405 - 415
  • [46] KD-LightNet: A Lightweight Network Based on Knowledge Distillation for Industrial Defect Detection
    Liu, Jinhai
    Li, Hengguang
    Zuo, Fengyuan
    Zhao, Zhen
    Lu, Senxiang
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2023, 72
  • [47] Knowledge Distillation Approach for Efficient Internal Language Model Estimation
    Chen, Zhipeng
    Xu, Haihua
    Khassanov, Yerbolat
    He, Yi
    Lu, Lu
    Ma, Zejun
    Wu, Ji
    INTERSPEECH 2023, 2023, : 1339 - 1343
  • [48] Plant disease detection based on lightweight CNN model
    Liu, Yang
    Gao, Guoqin
    Zhang, Zhenhui
    2021 4TH INTERNATIONAL CONFERENCE ON INFORMATION AND COMPUTER TECHNOLOGIES (ICICT 2021), 2021, : 64 - 68
  • [49] Poster Abstract: Efficient Knowledge Distillation to Train Lightweight Neural Network for Heterogeneous Edge Devices
    Kumari, Preti
    Gupta, Hari Prabhat
    Sikdar, Biplab
    PROCEEDINGS OF THE 21ST ACM CONFERENCE ON EMBEDDED NETWORKED SENSOR SYSTEMS, SENSYS 2023, 2023, : 546 - 547
  • [50] CerviSegNet-DistillPlus: An Efficient Knowledge Distillation Model for Enhancing Early Detection of Cervical Cancer Pathology
    Kang, Jie
    Li, Ning
    IEEE ACCESS, 2024, 12 : 85134 - 85149