Reference set thinning for the k-nearest neighbor decision rule

Cited by: 0
Authors
Bhattacharya, B [1 ]
Kaller, D [1 ]
Affiliation
[1] Simon Fraser Univ, Sch Comp Sci, Burnaby, BC V5A 1S6, Canada
Keywords
nearest neighbor rule; Voronoi diagram; Delaunay graph; Gabriel graph;
DOI
not available
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The k-nearest neighbor decision rule (or k-NNR) is used to classify a point in d-space according to the dominant class among its k nearest neighbors in some reference set S (in which each point has a known class). It is useful to find a small subset S' of S that can be used as the reference set instead. If the k-NNR always makes the same decision using either S or S' as the reference set, then S' is called an exact thinning of S for the k-NNR. In this paper, we show that such an exact thinning can be determined easily from the k-Delaunay graph of S (which is dual to the order-k Voronoi diagram of S). This graph "encodes" a particular subset of S that must be included within any exact thinning for the k-NNR, and it also provides information on how this subset can be augmented into an exact thinning (although perhaps not a minimum one). In addition, we investigate how the k-Gabriel graph (which is a subgraph of the k-Delaunay graph) can be used to derive an inexact thinning of S that performs well in practice for the k-NNR. It is advantageous to use the k-Gabriel graph instead of the k-Delaunay graph, because the k-Gabriel graph is smaller and much easier to compute from the point set S.
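The two ingredients of the abstract can be illustrated in a few lines: the k-NNR majority vote over a reference set, and the standard (order-1) Gabriel neighbor test, under which two points are neighbors iff no other point lies strictly inside the disk having their segment as diameter. This is an illustrative sketch in the plane, not the authors' implementation; the function names are our own, and the k-Gabriel generalization used in the paper is not shown.

```python
import math
from collections import Counter

def knn_classify(query, reference, k):
    """k-NNR: label a query point by majority vote among its k nearest
    reference points. Each reference item is ((x, y), label)."""
    nearest = sorted(reference, key=lambda pl: math.dist(pl[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

def gabriel_neighbors(p, q, points):
    """Order-1 Gabriel test: p and q are Gabriel neighbors iff no other
    point of the set lies strictly inside the disk with diameter pq."""
    mid = ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)
    radius = math.dist(p, q) / 2
    return all(math.dist(mid, s) >= radius
               for s in points if s != p and s != q)
```

A thinning heuristic in the spirit of the paper would keep only reference points that have a Gabriel neighbor of a different class, then run `knn_classify` on that reduced set.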
Pages: 238-242
Page count: 5
Related papers
50 records total
  • [1] A GENERALIZED K-NEAREST NEIGHBOR RULE
    PATRICK, EA
    FISCHER, FP
    INFORMATION AND CONTROL, 1970, 16 (02): : 128 - &
  • [2] Clustering-based reference set reduction for k-nearest neighbor
    Hwang, Seongseob
    Cho, Sungzoon
    ADVANCES IN NEURAL NETWORKS - ISNN 2007, PT 2, PROCEEDINGS, 2007, 4492 : 880 - +
  • [3] GENERALIZED K-NEAREST NEIGHBOR DECISION RULE FOR ISOLATED WORD RECOGNITION
    LEVINSON, SE
    JOURNAL OF THE ACOUSTICAL SOCIETY OF AMERICA, 1978, 64 : S180 - S180
  • [4] A Proposal for Local k Values for k-Nearest Neighbor Rule
    Garcia-Pedrajas, Nicolas
    Romero del Castillo, Juan A.
    Cerruela-Garcia, Gonzalo
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2017, 28 (02) : 470 - 475
  • [5] A Novel Weighted Voting for K-Nearest Neighbor Rule
    Gou, Jianping
    Xiong, Taisong
    Kuang, Yin
    JOURNAL OF COMPUTERS, 2011, 6 (05) : 833 - 840
  • [6] Estimating the posterior probabilities using the K-nearest neighbor rule
    Atiya, AF
    NEURAL COMPUTATION, 2005, 17 (03) : 731 - 740
  • [7] Fuzzy Monotonic K-Nearest Neighbor Versus Monotonic Fuzzy K-Nearest Neighbor
    Zhu, Hong
    Wang, Xizhao
    Wang, Ran
    IEEE TRANSACTIONS ON FUZZY SYSTEMS, 2022, 30 (09) : 3501 - 3513
  • [8] Comparative Analysis of K-Nearest Neighbor and Modified K-Nearest Neighbor Algorithm for Data Classification
    Okfalisa
    Mustakim
    Gazalba, Ikbal
    Reza, Nurul Gayatri Indah
    2017 2ND INTERNATIONAL CONFERENCES ON INFORMATION TECHNOLOGY, INFORMATION SYSTEMS AND ELECTRICAL ENGINEERING (ICITISEE): OPPORTUNITIES AND CHALLENGES ON BIG DATA FUTURE INNOVATION, 2017, : 294 - 298
  • [9] A new edited k-nearest neighbor rule in the pattern classification problem
    Hattori, K
    Takahashi, M
    PATTERN RECOGNITION, 2000, 33 (03) : 521 - 528
  • [10] Fault Isolation Based on k-Nearest Neighbor Rule for Industrial Processes
    Zhou, Zhe
    Wen, Chenglin
    Yang, Chunjie
    IEEE TRANSACTIONS ON INDUSTRIAL ELECTRONICS, 2016, 63 (04) : 2578 - 2586