Knowledge Distillation Facilitates the Lightweight and Efficient Plant Diseases Detection Model

Cited by: 15
Authors
Huang, Qianding [1 ]
Wu, Xingcai [1 ]
Wang, Qi [1 ,2 ]
Dong, Xinyu [1 ]
Qin, Yongbin [2 ]
Wu, Xue [1 ,3 ]
Gao, Yangyang [3 ]
Hao, Gefei [1 ,3 ]
Affiliations
[1] Guizhou University, College of Computer Science and Technology, State Key Laboratory of Public Big Data, Guiyang 550025, People's Republic of China
[2] Guizhou University, Engineering Research Center of Text Computing and Cognitive Intelligence, Ministry of Education, Guiyang 550025, People's Republic of China
[3] Guizhou University, National Key Laboratory of Green Pesticide, Guiyang 550025, People's Republic of China
Funding
National Natural Science Foundation of China
Keywords
Detection models; Disease detection; Efficient plant disease detection; Food production; Large-scale; Model parameters; Multi-stage; Object detection; Plant disease; Plant disease diagnosis
DOI
10.34133/plantphenomics.0062
Chinese Library Classification
S3 [Agronomy]
Discipline Code
0901
Abstract
Timely plant disease diagnosis can inhibit the spread of disease and prevent large-scale drops in yield, which benefits food production. Object detection-based plant disease diagnosis methods have attracted widespread attention for their accuracy in classifying and locating diseases. However, existing methods are still limited to diagnosing diseases of a single crop. More importantly, existing models have large numbers of parameters, which makes them difficult to deploy on agricultural mobile devices, while reducing the number of parameters tends to reduce model accuracy. To solve these problems, we propose a plant disease detection method based on knowledge distillation that achieves lightweight, efficient diagnosis of multiple diseases across multiple crops. Specifically, we design 2 strategies to build 4 lightweight student models, YOLOR-Light-v1, YOLOR-Light-v2, Mobile-YOLOR-v1, and Mobile-YOLOR-v2, and adopt the YOLOR model as the teacher. We develop a multistage knowledge distillation method to improve lightweight model performance, achieving 60.4% mAP@0.5 on the PlantDoc dataset with a small number of model parameters and outperforming existing methods. Overall, the multistage knowledge distillation technique makes the model lighter while maintaining high accuracy. Moreover, the technique can be extended to other tasks, such as image classification and image segmentation, to obtain automated plant disease diagnostic models with wider lightweight applicability in smart agriculture. Our code is available at https://github.com/QDH/MSKD.
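The abstract describes a teacher-student knowledge distillation setup: a large YOLOR teacher supervising lightweight student detectors. As a rough illustration of the general technique, not the authors' multistage MSKD pipeline, the sketch below distills a small stand-in classifier from a larger one using the classic soft-target loss; the TinyNet architecture, temperature, and alpha weighting are illustrative assumptions (see the linked repository for the actual implementation).

```python
# Minimal sketch of response-based teacher-student knowledge distillation.
# TinyNet, the temperature, and the loss weighting are illustrative
# assumptions, not the paper's MSKD method.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyNet(nn.Module):
    """Stand-in classifier; the paper uses YOLOR-family detectors instead."""
    def __init__(self, width: int, num_classes: int = 10):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, width, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(width, num_classes)

    def forward(self, x):
        return self.head(self.backbone(x))

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature: float = 4.0, alpha: float = 0.5):
    """Weighted sum of hard-label cross-entropy and soft-target KL divergence."""
    hard = F.cross_entropy(student_logits, targets)
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2  # rescale so soft-loss gradients match the hard loss
    return alpha * hard + (1 - alpha) * soft

teacher = TinyNet(width=64).eval()   # large, pretrained teacher (frozen)
student = TinyNet(width=16)          # lightweight student being trained
optimizer = torch.optim.SGD(student.parameters(), lr=0.01)

images = torch.randn(8, 3, 32, 32)
labels = torch.randint(0, 10, (8,))

optimizer.zero_grad()
with torch.no_grad():                # teacher only provides soft targets
    t_logits = teacher(images)
loss = distillation_loss(student(images), t_logits, labels)
loss.backward()
optimizer.step()
```

In a multistage variant such as the one the abstract names, this single distillation step would be applied repeatedly, with intermediate-capacity models or intermediate feature maps bridging the gap between the large teacher and the final lightweight student.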
Pages: 20