Estimation of Off-Target Dicamba Damage on Soybean Using UAV Imagery and Deep Learning

Times Cited: 5
Authors
Tian, Fengkai [1 ]
Vieira, Caio Canella [2 ]
Zhou, Jing [3 ]
Zhou, Jianfeng [4 ]
Chen, Pengyin [4 ]
Affiliations
[1] Univ Missouri, Dept Biomed Biol & Chem Engn, Columbia, MO 65211 USA
[2] Univ Arkansas, Bumpers Coll, Crop Soil & Environm Sci, Fayetteville, AR 72701 USA
[3] Univ Wisconsin Madison, Biol Syst Engn, Madison, WI 53706 USA
[4] Univ Missouri, Div Plant Sci & Technol, Columbia, MO 65211 USA
Keywords
soybean; dicamba tolerance; high-throughput phenotyping; deep learning; RECOGNITION; CNN;
DOI
10.3390/s23063241
CLC Number
O65 [Analytical Chemistry];
Discipline Codes
070302; 081704
Abstract
Weeds can cause significant yield losses and will continue to be a problem for agricultural production due to climate change. Dicamba is widely used to control weeds in monocot crops and in genetically engineered dicamba-tolerant (DT) dicot crops, such as soybean and cotton; its widespread use has resulted in severe off-target dicamba exposure and substantial yield losses in non-tolerant crops. There is a strong demand for non-genetically engineered DT soybeans developed through conventional breeding selection. Public breeding programs have identified genetic resources that confer greater tolerance to off-target dicamba damage in soybeans. Efficient, high-throughput phenotyping tools can facilitate the accurate collection of large numbers of crop traits and improve breeding efficiency. This study aimed to evaluate unmanned aerial vehicle (UAV) imagery and deep-learning-based data analytic methods for quantifying off-target dicamba damage in genetically diverse soybean genotypes. In this research, a total of 463 soybean genotypes were planted in five different fields (with different soil types) with prolonged exposure to off-target dicamba in 2020 and 2021. Crop damage due to off-target dicamba was assessed by breeders on a 1-5 scale in 0.5 increments, which was further classified into three classes, i.e., susceptible (>= 3.5), moderate (2.0 to 3.0), and tolerant (<= 1.5). A UAV platform equipped with a red-green-blue (RGB) camera was used to collect images on the same days as the visual assessments. Collected images were stitched to generate orthomosaic images for each field, and soybean plots were manually segmented from the orthomosaic images. Deep learning models, including dense convolutional neural network-121 (DenseNet121), residual neural network-50 (ResNet50), visual geometry group-16 (VGG16), and Xception (depthwise separable convolutions), were developed to quantify crop damage levels. Results show that DenseNet121 performed best, classifying damage with an accuracy of 82%.
The 95% binomial proportion confidence interval for accuracy ranged from 79% to 84% (p-value <= 0.01). In addition, no extreme misclassifications (i.e., misclassification between tolerant and susceptible soybeans) were observed. The results are promising since soybean breeding programs typically aim to identify genotypes with 'extreme' phenotypes (e.g., the top 10% of highly tolerant genotypes). This study demonstrates that UAV imagery and deep learning have great potential for high-throughput quantification of soybean damage due to off-target dicamba, improving the efficiency of crop breeding programs in selecting soybean genotypes with desired traits.
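The three-class scheme described in the abstract (tolerant <= 1.5, moderate 2.0 to 3.0, susceptible >= 3.5, on a 1-5 scale in 0.5 increments) can be sketched as a simple mapping; the function name below is a hypothetical illustration, not from the paper.

```python
def damage_class(score: float) -> str:
    """Map a breeder-assigned dicamba damage score (1-5 scale, 0.5 increments)
    to the three classes used in the study.  'damage_class' is a hypothetical
    helper name; the thresholds are those stated in the abstract."""
    if score <= 1.5:
        return "tolerant"
    if score >= 3.5:
        return "susceptible"
    return "moderate"  # covers scores from 2.0 to 3.0

print(damage_class(2.5))  # a plot scored 2.5 falls in the moderate class
```

Because scores move in 0.5 steps, every possible score falls into exactly one of the three classes.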
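The 95% binomial proportion confidence interval mentioned above can be computed from the observed accuracy and the test-set size. The abstract does not state which interval form the authors used or the test-set size, so the sketch below uses the normal-approximation (Wald) interval with a placeholder sample size.

```python
import math

def binomial_ci(p_hat: float, n: int, z: float = 1.96) -> tuple:
    """Normal-approximation (Wald) confidence interval for a binomial
    proportion: p_hat +/- z * sqrt(p_hat * (1 - p_hat) / n).
    z = 1.96 gives the 95% level."""
    half_width = z * math.sqrt(p_hat * (1.0 - p_hat) / n)
    return (p_hat - half_width, p_hat + half_width)

# n = 2500 is a placeholder; the record above does not report the number
# of test plots behind the 82% accuracy figure.
lo, hi = binomial_ci(0.82, 2500)
print(f"95% CI: [{lo:.3f}, {hi:.3f}]")
```

With a few thousand test samples, the interval tightens to roughly the 79-84% range reported; smaller test sets would widen it.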
Pages: 13
Related Papers (50 total)
  • [21] Deep learning improves the ability of sgRNA off-target propensity prediction
    Liu, Qiaoyue
    Cheng, Xiang
    Liu, Gan
    Li, Bohao
    Liu, Xiuqin
    BMC BIOINFORMATICS, 2020, 21 (01)
  • [22] Dicamba off-target movement from applications on soybeans at two growth stages
    Kruger, Greg R.
    Alves, Guilherme S.
    Schroeder, Kasey
    Golus, Jeffrey A.
    Reynolds, Daniel B.
    Dodds, Darrin M.
    Brown, Ashli E.
    Fritz, Bradley K.
    Hoffmann, Wesley C.
    AGROSYSTEMS GEOSCIENCES & ENVIRONMENT, 2023, 6 (02)
  • [23] Evaluating spatial scale effects of dicamba applications on off-target vapor movement
    Orr, Tom
    Pai, Naresh
    Sall, Erik
    DesAutels, Christopher
    Popovic, Jelena
    Reiss, Richard
    ABSTRACTS OF PAPERS OF THE AMERICAN CHEMICAL SOCIETY, 2018, 256
  • [24] Off-target predictions in CRISPR-Cas9 gene editing using deep learning
    Lin, Jiecong
    Wong, Ka-Chun
    BIOINFORMATICS, 2018, 34 (17) : 656 - 663
  • [25] Chestnut Burr Segmentation for Yield Estimation Using UAV-Based Imagery and Deep Learning
    Carneiro, Gabriel A.
    Santos, Joaquim
    Sousa, Joaquim J.
    Cunha, Antonio
    Padua, Luis
    DRONES, 2024, 8 (10)
  • [26] Improved Estimation of Aboveground Biomass in Rubber Plantations Using Deep Learning on UAV Multispectral Imagery
    Tan, Hongjian
    Kou, Weili
    Xu, Weiheng
    Wang, Leiguang
    Wang, Huan
    Lu, Ning
    DRONES, 2025, 9 (01)
  • [27] Inference of drug off-target effects on cellular signaling using interactome-based deep learning
    Meimetis, Nikolaos
    Lauffenburger, Douglas A.
    Nilsson, Avlant
    ISCIENCE, 2024, 27 (04)
  • [28] Damage detection with an autonomous UAV using deep learning
    Kang, Dongho
    Cha, Young-Jin
    SENSORS AND SMART STRUCTURES TECHNOLOGIES FOR CIVIL, MECHANICAL, AND AEROSPACE SYSTEMS 2018, 2018, 10598
  • [29] Prediction of on-target and off-target activity of CRISPR-Cas13d guide RNAs using deep learning
    Wessels, Hans-Hermann
    Stirn, Andrew
    Mendez-Mancilla, Alejandro
    Kim, Eric J.
    Hart, Sydney K.
    Knowles, David A.
    Sanjana, Neville E.
    NATURE BIOTECHNOLOGY, 2024, 42 (04) : 628 - 637