CSKNN: Cost-sensitive K-Nearest Neighbor using hyperspectral imaging for identification of wheat varieties

Cited: 7
Authors
Jin, Songlin [1 ]
Zhang, Fengfan [2 ]
Zheng, Ying [1 ]
Zhou, Ling [1 ]
Zuo, Xiangang [1 ]
Zhang, Ziyang [4 ]
Zhao, Wenyi [3 ]
Zhang, Weidong [1 ]
Pan, Xipeng [5 ]
Affiliations
[1] Henan Inst Sci & Technol, Sch Informat Engn, Xinxiang 453003, Peoples R China
[2] Zhengzhou Univ Ind Technol, Business Sch, Zhengzhou 451150, Peoples R China
[3] Beijing Univ Posts & Telecommun, Sch Artificial Intelligence, Beijing 100876, Peoples R China
[4] Henan Inst Sci & Technol, Sch Life Sci & Technol, Xinxiang 453003, Peoples R China
[5] Guilin Univ Elect Technol, Sch Comp Sci & Informat Secur, Guilin 541004, Peoples R China
Keywords
Hyperspectral imaging; Cost-sensitive K-Nearest Neighbor; Smoothing denoising; Linear discriminant analysis;
DOI
10.1016/j.compeleceng.2023.108896
CLC number
TP3 [Computing technology, computer technology];
Discipline code
0812 ;
Abstract
Hyperspectral imaging techniques are widely used for rapid, efficient, and non-destructive identification of wheat varieties. However, noise interference in hyperspectral images and the underutilization of spatial information by most methods remain two challenging issues in identifying wheat varieties. In this paper, we present a new approach called Cost-sensitive K-Nearest Neighbor using Hyperspectral imaging (CSKNN) to address these issues. First, we fuse 128 bands acquired by hyperspectral imaging equipment to obtain hyperspectral images of wheat grains, and employ a central regionalization strategy to extract the region of interest. We then use a smoothing denoising strategy to remove noise from the hyperspectral images and improve the saliency of the target grains. Furthermore, we account for the characteristics of different bands and use linear discriminant analysis to compress features, reducing intra-class differences and increasing inter-class differences. Finally, we propose a cost-sensitive KNN for training and testing on wheat varieties. Our experiments on datasets of different wheat strains and varieties from the same region show that CSKNN achieves high classification accuracies of 98.09% and 97.45%, outperforming state-of-the-art methods.
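The final two stages of the pipeline described above (LDA feature compression followed by a cost-sensitive nearest-neighbor vote) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the cost matrix, the choice of k, and the expected-cost voting rule are assumptions, and random Gaussian clusters stand in for the LDA-compressed hyperspectral features of wheat grains.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors

def cost_sensitive_knn_predict(X_train, y_train, X_test, cost, k=5):
    """Predict the class that minimizes expected misclassification cost
    over the k nearest neighbors. cost[i, j] is the cost of predicting
    class j when the true class is i (zeros on the diagonal)."""
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)
    _, idx = nn.kneighbors(X_test)
    n_classes = cost.shape[0]
    preds = np.empty(len(X_test), dtype=int)
    for row, neighbors in enumerate(idx):
        # empirical class distribution among the k neighbors
        p = np.bincount(y_train[neighbors], minlength=n_classes) / k
        # expected cost of predicting j is sum_i p[i] * cost[i, j]
        preds[row] = np.argmin(p @ cost)
    return preds

rng = np.random.default_rng(0)
# toy stand-in for spectral features of 3 wheat classes, 30 grains each
X = np.vstack([rng.normal(c, 0.5, size=(30, 8)) for c in range(3)])
y = np.repeat(np.arange(3), 30)

# LDA compresses features while separating the classes
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)

# uniform off-diagonal costs reduce the rule to ordinary majority voting;
# raising one column penalizes predicting that class when it is uncertain
cost = np.ones((3, 3)) - np.eye(3)

Xtr, Xte, ytr, yte = train_test_split(
    X_lda, y, test_size=0.3, stratify=y, random_state=0)
pred = cost_sensitive_knn_predict(Xtr, ytr, Xte, cost)
print("accuracy:", (pred == yte).mean())
```

With a non-uniform cost matrix, the same vote shifts predictions away from classes whose misidentification is expensive, which is the motivation for cost sensitivity when some wheat varieties are easily confused.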
Pages: 12
Related papers (50 total)
  • [21] Sublinear time approximation of the cost of a metric k-nearest neighbor graph
    Czumaj, Artur
    Sohler, Christian
    PROCEEDINGS OF THE THIRTY-FIRST ANNUAL ACM-SIAM SYMPOSIUM ON DISCRETE ALGORITHMS (SODA'20), 2020, : 2973 - 2992
  • [23] Fast k-nearest Neighbor Search for Face Identification Using Bounds of Residual Score
    Ishii, Masato
    Imaoka, Hitoshi
    Sato, Atsushi
    2017 12TH IEEE INTERNATIONAL CONFERENCE ON AUTOMATIC FACE AND GESTURE RECOGNITION (FG 2017), 2017, : 194 - 199
  • [24] Hyperspectral imagery classification using the combination of improved laplacian eigenmaps and improved k-nearest neighbor classifier
    Sun, Weiwei
    Liu, Chun
    Li, Weiyue
    Wuhan Daxue Xuebao (Xinxi Kexue Ban)/Geomatics and Information Science of Wuhan University, 2015, 40 (09): : 1151 - 1156
  • [25] Ultrasound k-nearest neighbor entropy imaging: Theory, algorithm, and applications
    Li, Sinan
    Tsui, Po-Hsiang
    Wu, Weiwei
    Wu, Shuicai
    Zhou, Zhuhuang
    ULTRASONICS, 2024, 138
  • [26] k-Nearest Neighbor Regressors Optimized by using Random Search
    Ortiz-Bejar, Jose
    Graff, Mario
    Tellez, Eric S.
    Ortiz-Bejar, Jesus
    Cerda Jacobo, Jaime
    2018 IEEE INTERNATIONAL AUTUMN MEETING ON POWER, ELECTRONICS AND COMPUTING (ROPEC), 2018,
  • [27] A Fast k-Nearest Neighbor Classifier Using Unsupervised Clustering
    Vajda, Szilard
    Santosh, K. C.
    RECENT TRENDS IN IMAGE PROCESSING AND PATTERN RECOGNITION (RTIP2R 2016), 2017, 709 : 185 - 193
  • [28] Using a genetic algorithm for editing k-nearest neighbor classifiers
    Gil-Pita, R.
    Yao, X.
INTELLIGENT DATA ENGINEERING AND AUTOMATED LEARNING - IDEAL 2007, 2007, 4881 : 1141+
  • [29] Identification of model order and number of neighbors for k-nearest neighbor resampling
    Lee, Taesam
    Ouarda, Taha B. M. J.
    JOURNAL OF HYDROLOGY, 2011, 404 (3-4) : 136 - 145
  • [30] Predicting Math Test Scores Using K-Nearest Neighbor
    Brown, Jessica Maikhanh
    PROCEEDINGS OF THE 2017 7TH IEEE INTEGRATED STEM EDUCATION CONFERENCE (ISEC), 2017, : 104 - 106