Knowledge Distillation Facilitates the Lightweight and Efficient Plant Diseases Detection Model

Cited by: 15
Authors
Huang, Qianding [1]
Wu, Xingcai [1]
Wang, Qi [1,2]
Dong, Xinyu [1]
Qin, Yongbin [2]
Wu, Xue [1,3]
Gao, Yangyang [3]
Hao, Gefei [1,3]
Affiliations
[1] Guizhou Univ, Coll Comp Sci & Technol, State Key Lab Publ Big Data, Guiyang 550025, Peoples R China
[2] Guizhou Univ, Text Comp & Cognit Intelligence Engn Res Ctr, Natl Educ Minist, Guiyang 550025, Peoples R China
[3] Guizhou Univ, Natl Key Lab Green Pesticide, Guiyang 550025, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Detection models; Disease detection; Efficient plants; Food production; Large-scales; Modeling parameters; Multi-stages; Objects detection; Plant disease; Plant disease diagnosis
DOI
10.34133/plantphenomics.0062
Chinese Library Classification
S3 [Agronomy]
Discipline Classification Code
0901
Abstract
Timely plant disease diagnosis can inhibit the spread of disease and prevent a large-scale drop in production, which benefits food production. Object detection-based plant disease diagnosis methods have attracted widespread attention due to their accuracy in classifying and locating diseases. However, existing methods are still limited to single-crop disease diagnosis. More importantly, existing models have large numbers of parameters, which hinders deployment on agricultural mobile devices, while reducing the number of model parameters tends to decrease model accuracy. To solve these problems, we propose a plant disease detection method based on knowledge distillation to achieve lightweight and efficient diagnosis of multiple diseases across multiple crops. In detail, we design 2 strategies to build 4 different lightweight models as student models, YOLOR-Light-v1, YOLOR-Light-v2, Mobile-YOLOR-v1, and Mobile-YOLOR-v2, and adopt the YOLOR model as the teacher model. We develop a multistage knowledge distillation method to improve lightweight model performance, achieving 60.4% mAP@.5 on the PlantDoc dataset with a small number of model parameters and outperforming existing methods. Overall, the multistage knowledge distillation technique can make the model lighter while maintaining high accuracy. Moreover, the technique can be extended to other tasks, such as image classification and image segmentation, to obtain automated plant disease diagnosis models with wider lightweight applicability in smart agriculture. Our code is available at https://github.com/QDH/MSKD.
Pages: 20
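
Below is a minimal PyTorch sketch of the kind of multistage (feature plus logit) teacher-student distillation loss the abstract describes. The class name MultiStageKDLoss, the stage and channel choices, the 1x1 adapter convolutions, the temperature, and the loss weights are illustrative assumptions for exposition only; they are not the authors' released implementation, which is available at https://github.com/QDH/MSKD.

# Minimal multistage knowledge-distillation sketch (PyTorch).
# Assumption: NOT the released MSKD code; stages, adapters, temperature,
# and weights are placeholders for illustration.
import torch.nn as nn
import torch.nn.functional as F


class MultiStageKDLoss(nn.Module):
    """Distills several intermediate feature maps plus the final class logits
    from a frozen teacher (e.g., YOLOR) into a lightweight student."""

    def __init__(self, student_channels, teacher_channels,
                 stage_weights=(0.5, 0.5, 1.0), logit_weight=1.0, temperature=4.0):
        super().__init__()
        # 1x1 convolutions project student features to the teacher's channel
        # width so the per-stage L2 losses are well defined.
        self.adapters = nn.ModuleList(
            nn.Conv2d(cs, ct, kernel_size=1)
            for cs, ct in zip(student_channels, teacher_channels)
        )
        self.stage_weights = stage_weights
        self.logit_weight = logit_weight
        self.t = temperature

    def forward(self, student_feats, teacher_feats, student_logits, teacher_logits):
        # Feature-level distillation over the selected backbone stages.
        feat_loss = 0.0
        for w, adapter, fs, ft in zip(self.stage_weights, self.adapters,
                                      student_feats, teacher_feats):
            fs = adapter(fs)
            if fs.shape[-2:] != ft.shape[-2:]:
                # Match spatial resolution if the backbones downsample differently.
                fs = F.interpolate(fs, size=ft.shape[-2:], mode="bilinear",
                                   align_corners=False)
            feat_loss = feat_loss + w * F.mse_loss(fs, ft.detach())

        # Logit-level distillation: KL divergence between temperature-softened
        # class distributions, scaled by T^2 as in standard soft-label KD.
        kd_loss = F.kl_div(
            F.log_softmax(student_logits / self.t, dim=-1),
            F.softmax(teacher_logits.detach() / self.t, dim=-1),
            reduction="batchmean",
        ) * (self.t ** 2)
        return feat_loss + self.logit_weight * kd_loss


# Usage sketch (shapes and channel widths are illustrative): combine with the
# student's normal detection loss, e.g. total = det_loss + kd_weight * kd_term.
# criterion = MultiStageKDLoss(student_channels=[128, 256, 512],
#                              teacher_channels=[256, 512, 1024])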
Related Papers
50 records in total; entries [31] to [40] shown
  • [31] Lightweight UAV Object-Detection Method Based on Efficient Multidimensional Global Feature Adaptive Fusion and Knowledge Distillation
    Sun, Jian
    Gao, Hongwei
    Yan, Zhiwen
    Qi, Xiangjing
    Yu, Jiahui
    Ju, Zhaojie
    ELECTRONICS, 2024, 13 (08)
  • [32] A Lightweight Approach for Network Intrusion Detection based on Self-Knowledge Distillation
    Yang, Shuo
    Zheng, Xinran
    Xu, Zhengzhuo
    Wang, Xingjun
ICC 2023 - IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023: 3000-3005
  • [33] A Lightweight Pig Face Recognition Method Based on Automatic Detection and Knowledge Distillation
    Ma, Ruihan
    Ali, Hassan
    Chung, Seyeon
    Kim, Sang Cheol
    Kim, Hyongsuk
APPLIED SCIENCES-BASEL, 2024, 14 (01)
  • [34] Spatial-temporal knowledge distillation for lightweight network traffic anomaly detection
    Wang, Xintong
    Wang, Zixuan
    Wang, Enliang
    Sun, Zhixin
    COMPUTERS & SECURITY, 2024, 137
  • [35] Lightweight Inception Networks for the Recognition and Detection of Rice Plant Diseases
    Chen, Junde
    Chen, Weirong
    Zeb, Adan
    Yang, Shuangyuan
    Zhang, Defu
IEEE SENSORS JOURNAL, 2022, 22 (14): 14628-14638
  • [36] Toward Efficient Image Denoising: A Lightweight Network with Retargeting Supervision Driven Knowledge Distillation
    Zou, Beiji
    Zhang, Yue
    Wang, Min
    Liu, Shu
ADVANCES IN COMPUTER GRAPHICS, CGI 2022, 2022, 13443: 15-27
  • [37] Knowledge distillation for efficient standard scanplane detection of fetal ultrasound
Dapueto, Jacopo
Zini, Luca
Odone, Francesca
Medical & Biological Engineering & Computing, 2024, 62: 73-82
  • [38] Knowledge distillation approach for skin cancer classification on lightweight deep learning model
    Saha, Suman
    Hemal, Md. Moniruzzaman
    Eidmum, Md. Zunead Abedin
    Mridha, Muhammad Firoz
    Healthcare Technology Letters, 2025, 12 (01)
  • [39] Lightweight Model Pre-Training via Language Guided Knowledge Distillation
    Li, Mingsheng
    Zhang, Lin
    Zhu, Mingzhen
    Huang, Zilong
    Yu, Gang
    Fan, Jiayuan
    Chen, Tao
IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26: 10720-10730
  • [40] Learning Lightweight Face Detector with Knowledge Distillation
    Jin, Haibo
    Zhang, Shifeng
    Zhu, Xiangyu
    Tang, Yinhang
    Lei, Zhen
    Li, Stan Z.
2019 INTERNATIONAL CONFERENCE ON BIOMETRICS (ICB), 2019