Efficient kNN Classification With Different Numbers of Nearest Neighbors

Cited by: 905
Authors
Zhang, Shichao [1 ]
Li, Xuelong [2 ]
Zong, Ming [1 ]
Zhu, Xiaofeng [1 ]
Wang, Ruili [3 ]
Affiliations
[1] Guangxi Normal Univ, Coll Comp Sci & Informat Technol, Guangxi Key Lab MIMS, Guilin 541004, Peoples R China
[2] Chinese Acad Sci, Xian Inst Opt & Precis Mech, Ctr OPT IMagery Anal & Learning, State Key Lab Transient Opt & Photon, Xian 710119, Shaanxi, Peoples R China
[3] Massey Univ, Inst Nat & Math Sci, Auckland 4442, New Zealand
Keywords
Decision tree; k nearest neighbor (kNN) classification; sparse coding; image; selection; extraction; regression; algorithm
DOI
10.1109/TNNLS.2017.2673241
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
The k nearest neighbor (kNN) method is a popular classification method in data mining and statistics because of its simple implementation and strong classification performance. However, it is impractical for traditional kNN methods to assign a fixed k value (even one set by experts) to all test samples. Previous solutions assign different k values to different test samples via cross validation, but they are usually time-consuming. This paper proposes a kTree method that learns different optimal k values for different test/new samples by introducing a training stage into kNN classification. Specifically, in the training stage, the kTree method first learns the optimal k value for each training sample via a new sparse reconstruction model, and then constructs a decision tree (namely, a kTree) from the training samples and their learned optimal k values. In the test stage, the kTree quickly outputs the optimal k value for each test sample, and kNN classification is then conducted with that learned optimal k value and all training samples. As a result, the proposed kTree method achieves higher classification accuracy at a running cost similar to that of traditional kNN methods, which assign a fixed k value to all test samples; and it achieves similar classification accuracy at a lower running cost than recent kNN methods, which assign different k values to different test samples. This paper further proposes an improved version of the kTree method (namely, the k*Tree method) that speeds up the test stage by additionally storing information about the training samples in the leaf nodes of the kTree, such as the training samples located in each leaf node, their kNNs, and the nearest neighbors of those kNNs. We call the resulting decision tree a k*Tree; it conducts kNN classification using only the subset of training samples stored in a leaf node rather than all training samples used by recent kNN methods, which further reduces the running cost of the test stage. Finally, experimental results on 20 real data sets show that the proposed methods (i.e., kTree and k*Tree) are much more efficient than the compared methods on classification tasks.
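The two-stage pipeline described in the abstract can be summarized in a short sketch. This is a minimal illustration under stated assumptions, not the authors' implementation: the paper's sparse reconstruction model is approximated here by a Lasso reconstruction whose nonzero coefficient count serves as the per-sample optimal k, and the helper names learn_optimal_k, ktree_fit, and ktree_predict, as well as the alpha and k_max values, are illustrative choices.

import numpy as np
from sklearn.linear_model import Lasso
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

def learn_optimal_k(X, alpha=0.05, k_min=1, k_max=15):
    # Stand-in for the paper's sparse reconstruction model (an assumption):
    # reconstruct each training sample from the remaining samples with a
    # Lasso penalty and read off the sparsity as that sample's optimal k.
    n = X.shape[0]
    ks = np.empty(n, dtype=int)
    for i in range(n):
        others = np.delete(X, i, axis=0)
        lasso = Lasso(alpha=alpha, max_iter=5000)
        lasso.fit(others.T, X[i])              # columns = the other samples
        k = np.count_nonzero(lasso.coef_)      # sparsity ~ neighborhood size
        ks[i] = np.clip(k, k_min, k_max)
    return ks

def ktree_fit(X_train):
    # Training stage: learn per-sample optimal k values, then build the
    # "kTree", a decision tree mapping a feature vector to a k value.
    ks = learn_optimal_k(X_train)
    return DecisionTreeClassifier().fit(X_train, ks)

def ktree_predict(ktree, X_train, y_train, X_test):
    # Test stage: the kTree outputs k for each test sample, and kNN runs
    # with that k. Refitting a classifier per sample is for clarity only.
    preds = []
    for x in X_test:
        k = int(ktree.predict(x.reshape(1, -1))[0])
        knn = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
        preds.append(knn.predict(x.reshape(1, -1))[0])
    return np.array(preds)

In the same spirit, the k*Tree variant would additionally cache at each leaf the training samples that fall into it together with their precomputed nearest neighbors, so the test-stage search scans only that cached subset instead of the full training set.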
Pages: 1774-1785
Page count: 12
Related Papers
50 in total
  • [21] Binary Classification Based on SVDD Projection and Nearest Neighbors
    Kang, Daesung
    Park, Jooyoung
    Principe, Jose C.
    2010 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS IJCNN 2010, 2010,
  • [22] Under-bagging Nearest Neighbors for Imbalanced Classification
    Hang, Hanyuan
    Cai, Yuchao
    Yang, Hanfang
    Lin, Zhouchen
    JOURNAL OF MACHINE LEARNING RESEARCH, 2022, 23
  • [23] Adaptive k-nearest-neighbor classification using a dynamic number of nearest neighbors
    Ougiaroglou, Stefanos
    Nanopoulos, Alexandros
    Papadopoulos, Apostolos N.
    Manolopoulos, Yannis
    Welzer-Druzovec, Tatjana
    ADVANCES IN DATABASES AND INFORMATION SYSTEMS, PROCEEDINGS, 2007, 4690 : 66 - +
  • [24] Locating Renewable Energy Generators Using K-Nearest Neighbors (KNN) Algorithm
    Asadi, Meysam
    Pourhossein, Kazem
    2019 IRANIAN CONFERENCE ON RENEWABLE ENERGY & DISTRIBUTED GENERATION (ICREDG), 2019,
  • [25] Study on an adaptive thermal comfort model with K-nearest-neighbors (KNN) algorithm
    Xiong, Lei
    Yao, Ye
    BUILDING AND ENVIRONMENT, 2021, 202
  • [26] Efficient Nearest Neighbors via Robust Sparse Hashing
    Cherian, Anoop
    Sra, Suvrit
    Morellas, Vassilios
    Papanikolopoulos, Nikolaos
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2014, 23 (08) : 3646 - 3655
  • [27] ITP-KNN: Encrypted Video Flow Identification Based on the Intermittent Traffic Pattern of Video and K-Nearest Neighbors Classification
    Liu, Youting
    Li, Shu
    Zhang, Chengwei
    Zheng, Chao
    Sun, Yong
    Liu, Qingyun
    COMPUTATIONAL SCIENCE - ICCS 2020, PT II, 2020, 12138 : 279 - 293
  • [28] An Efficient Ensemble Algorithm for Boosting k-Nearest Neighbors Classification Performance via Feature Bagging
    Nguyen, Huu-Hoa
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2024, 15 (06) : 767 - 776
  • [29] PERFORMANCE OF K-NEAREST NEIGHBORS ALGORITHM IN OPINION CLASSIFICATION
    Jedrzejewski, Krzysztof
    Zamorski, Maurycy
    FOUNDATIONS OF COMPUTING AND DECISION SCIENCES, 2013, 38 (02) : 97 - 110
  • [30] Diminishing Prototype Size for k-Nearest Neighbors Classification
    Samadpour, Mohammad Mehdi
    Parvin, Hamid
    Rad, Farhad
    2015 FOURTEENTH MEXICAN INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE (MICAI), 2015, : 139 - 144