High-precision extraction method for maize planting information based on UAV RGB images

Cited by: 0
Authors
Zhi J. [1 ,2 ]
Dong Y. [3 ]
Lu L. [4 ]
Shi J. [1 ]
Luo W. [1 ]
Zhou Y. [1 ]
Geng T. [1 ]
Xia J. [1 ]
Jia C. [1 ]
Affiliations
[1] School of Geography and Tourism, Anhui Normal University, Wuhu
[2] Key Laboratory of Earth Surface Processes and Regional Response in the Yangtze-Huaihe River Basin, Wuhu
[3] College of Environmental and Resources Sciences, Zhejiang Forestry University, Hangzhou
[4] Natural Resources and Planning Bureau of Taihu County, Anqing
Keywords
Machine learning; Maize; Models; Remote sensing; Texture feature; Unmanned aerial vehicle; Vegetation index
DOI
10.11975/j.issn.1002-6819.2021.18.006
Abstract
Ultra-high-resolution aerial images obtained from Unmanned Aerial Vehicles (UAVs) have been widely used in recent years to extract crop planting information. However, high-resolution multispectral and hyperspectral images are usually costly to acquire and time-consuming to process. It is therefore necessary to make effective use of easily accessible, low-cost, high-resolution RGB images, and in particular to eliminate common noise (e.g., shadows and bare land) for better extraction accuracy of crop planting information. In this study, a high-precision extraction method was proposed to obtain maize planting information using 1.8 cm resolution UAV aerial orthophotos (i.e., RGB images). The experimental maize farm was located in Southeast Africa, where images were taken at noon during the maize growing season. Classification features were selected from three aspects: spectrum, color space, and image texture. Five classification algorithms were then selected to extract maize planting information: Bayes, K-Nearest Neighbor (KNN), Support Vector Machine (SVM), Decision Tree (DT), and Random Forest (RF). Firstly, an object-oriented interpretation platform (eCognition 9.0) was used to compute the Hue, Saturation, and Intensity (HSI) color space transform, eight types of RGB texture, and five vegetation indices: the normalized green-red difference index, red-green ratio index, vegetation color verification index, visible-band difference vegetation index, and excess green vegetation index.
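As a rough illustration, visible-band vegetation indices like those named above can be computed directly from an RGB array. The formulas below follow common definitions in the UAV literature and are assumptions, not the paper's exact variants; in particular, the "vegetation color verification index" is assumed here to correspond to the color index of vegetation extraction (CIVE).

```python
import numpy as np

def visible_band_indices(rgb):
    """Compute five visible-band vegetation indices from an RGB image.

    rgb: float array of shape (H, W, 3) with values in [0, 1].
    Formulas follow common literature definitions; the paper's exact
    variants may differ slightly.
    """
    R, G, B = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    eps = 1e-12  # guard against division by zero on dark pixels
    total = R + G + B + eps
    r, g, b = R / total, G / total, B / total  # chromatic coordinates

    return {
        # normalized green-red difference index
        "NGRDI": (G - R) / (G + R + eps),
        # red-green ratio index
        "RGRI": R / (G + eps),
        # color index of vegetation extraction (assumed variant)
        "CIVE": 0.441 * r - 0.811 * g + 0.385 * b + 18.78745,
        # visible-band difference vegetation index
        "VDVI": (2 * G - R - B) / (2 * G + R + B + eps),
        # excess green vegetation index
        "ExG": 2 * g - r - b,
    }
```

For a predominantly green pixel, NGRDI, VDVI, and ExG are positive and RGRI is below 1, which is the behavior these indices exploit to separate vegetation from bare land.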
Then, three types of feature space were constructed: 1) the first consisted of the three single-feature subsets, i.e., vegetation indices, HSI color space features, and RGB image texture features; 2) the second consisted of combinations of these three feature types, taken in pairs or all together; 3) the third consisted of the optimal factors obtained by performing dimension reduction, using RF, on the combination of all three feature types. Subsequently, the RGB images were classified into three land-cover types: maize, bare land, and shadow. The Bayes, KNN, SVM, DT, and RF models were then used for supervised classification, with accuracy assessed by an error matrix. The results showed that the optimal classification accuracy was obtained neither with a single feature type nor with all three feature types together; a combination of features usually achieved higher accuracy than a single feature. Specifically, the best choice was the combination of HSI color and RGB image texture features using RF, which achieved the highest overall accuracy of 86.2% and a Kappa coefficient of 0.793. Additionally, feature dimension reduction using RF neither significantly improved nor reduced classification accuracy (except for SVM), but the factors it retained were easily interpreted and consistent with the actual ground conditions, and both classification efficiency and stability were greatly improved. These findings can provide a practical solution for the high-precision extraction of crop planting information from UAV RGB images. © 2021, Editorial Department of the Transactions of the Chinese Society of Agricultural Engineering. All rights reserved.
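The RF-based classification, importance-driven dimension reduction, and error-matrix/Kappa assessment described above can be sketched with scikit-learn. The paper itself used eCognition; the sample counts, feature count, and five-feature cutoff below are illustrative assumptions, and the data is synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import cohen_kappa_score, confusion_matrix

rng = np.random.default_rng(0)
# Synthetic stand-in for per-object features: 300 samples, 16 features,
# three land-cover classes (0 = maize, 1 = bare land, 2 = shadow).
X = rng.normal(size=(300, 16))
y = rng.integers(0, 3, size=300)
X[:, 0] += y  # give the forest at least one informative feature

# Fit an RF on all features, then keep only the most important ones
# (feature dimension reduction by RF importance ranking).
rf_full = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
top = np.argsort(rf_full.feature_importances_)[::-1][:5]  # five best features
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:, top], y)

# Accuracy assessment: error (confusion) matrix and Kappa coefficient.
pred = rf.predict(X[:, top])
err_matrix = confusion_matrix(y, pred)
kappa = cohen_kappa_score(y, pred)
```

In practice the retained features would be the vegetation-index, HSI, and texture attributes computed per image object, and accuracy would be assessed on an independent validation sample rather than the training data used in this sketch.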
Pages: 48-54
Page count: 6