The nearest neighbor algorithm of local probability centers

Cited by: 77
Authors
Li, Boyu [1 ]
Chen, Yun Wen [1 ]
Chen, Yan Qiu [1 ]
Affiliations
[1] Fudan Univ, Dept Comp Sci & Engn, Sch Informat Sci & Engn, Shanghai 200433, Peoples R China
Funding
National Natural Science Foundation of China; Specialized Research Fund for the Doctoral Program of Higher Education;
Keywords
nearest center; nearest neighbor; negative-contributing sample (NCS); pattern classification; probability mean;
DOI
10.1109/TSMCB.2007.908363
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
When classes are nonseparable or overlapping, training samples in a local neighborhood may come from different classes. In this situation, samples with different class labels may be comparably represented in the neighborhood of the query, so a conventional nearest neighbor classifier, such as the k-nearest neighbor scheme, may produce a wrong prediction. To address this issue, this paper proposes a new classification method that performs classification based on the local probabilistic centers of each class. The method works by reducing the number of negative-contributing points in the training set, i.e., samples that fall on the wrong side of the ideal decision boundary, and by restricting their influence regions. At classification time, the query sample is labeled using one of two measures: the distance between the query and the local categorical probability centers, or the estimated posterior probability of the query. Although both measures are effective, experiments show that the second achieves the smaller classification error. Theoretical analyses of the proposed methods are presented, and experiments are conducted on both synthetic and real datasets. The results show that this method substantially improves the classification performance of the nearest neighbor algorithm.
Pages: 141-154
Page count: 14
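As an illustration of the idea summarized in the abstract, the sketch below (Python with NumPy) classifies a query by its distance to per-class local probability centers. It is a minimal sketch, not the paper's algorithm: the function name `local_probability_center_classify`, the choice of k, and the per-sample posterior estimate (the fraction of a sample's k nearest training neighbors that share its label, used to down-weight likely negative-contributing samples) are illustrative assumptions.

```python
import numpy as np

def local_probability_center_classify(X_train, y_train, x_query, k=5):
    """Assign x_query to the class whose local probability center is nearest.

    Sketch only: for each class, the query's k nearest same-class samples are
    averaged with weights given by a rough per-sample posterior estimate, so
    that likely negative-contributing samples (points surrounded mostly by
    the other class) pull the center less.
    """
    classes = np.unique(y_train)

    # Rough posterior estimate per training sample: the fraction of its own
    # k nearest training neighbors that share its label (illustrative choice).
    pairwise = np.linalg.norm(X_train[:, None, :] - X_train[None, :, :], axis=2)
    np.fill_diagonal(pairwise, np.inf)
    neighbors = np.argsort(pairwise, axis=1)[:, :k]
    posterior = (y_train[neighbors] == y_train[:, None]).mean(axis=1)

    best_class, best_dist = None, np.inf
    for c in classes:
        members = np.where(y_train == c)[0]
        # The query's k nearest training samples of class c.
        d = np.linalg.norm(X_train[members] - x_query, axis=1)
        local = members[np.argsort(d)[:k]]
        weights = posterior[local]
        if weights.sum() == 0:
            continue  # every local sample looks like a negative contributor
        center = (weights[:, None] * X_train[local]).sum(axis=0) / weights.sum()
        dist = np.linalg.norm(x_query - center)
        if dist < best_dist:
            best_class, best_dist = c, dist
    return best_class


# Toy usage on two overlapping Gaussian classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(1.5, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(local_probability_center_classify(X, y, np.array([0.25, 0.25]), k=7))
```

The abstract also mentions a second decision rule based on the query's computed posterior probability; in a sketch like this it could be approximated, for example, by normalizing inverse distances to the class centers, but the paper's exact formulation is not reproduced here.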
Related papers
50 items in total
  • [41] Kislitsyn, A. A. Modeling the Nearest Neighbor Graphs to Estimate the Probability of the Independence of Data. Mathematical Models and Computer Simulations, 2023, 15(Suppl 1): S41-S53.
  • [42] Okun, O. G. K-local hyperplane distance nearest-neighbor algorithm and protein fold recognition. Pattern Recognition and Image Analysis, 2006, 16(1): 19-22.
  • [43] Okfalisa; Mustakim; Gazalba, Ikbal; Reza, Nurul Gayatri Indah. Comparative Analysis of K-Nearest Neighbor and Modified K-Nearest Neighbor Algorithm for Data Classification. 2017 2nd International Conferences on Information Technology, Information Systems and Electrical Engineering (ICITISEE): Opportunities and Challenges on Big Data Future Innovation, 2017: 294-298.
  • [44] McCann, Sancho; Lowe, David G. Local Naive Bayes Nearest Neighbor for Image Classification. 2012 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2012: 3650-3656.
  • [45] Noh, Yung-Kyun; Zhang, Byoung-Tak; Lee, Daniel D. Generative Local Metric Learning for Nearest Neighbor Classification. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2018, 40(1): 106-118.
  • [46] Snapp, R. R. Local polynomial metrics for K nearest neighbor classifiers. Uncertainty in Geometric Computations, 2002, 704: 155-164.
  • [47] Ricci, F.; Avesani, P. Data compression and local metrics for nearest neighbor classification. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1999, 21(4): 380-384.
  • [48] Mao, Chengsheng; Hu, Bin; Moore, Philip; Su, Yun; Wang, Manman. Nearest Neighbor Method Based on Local Distribution for Classification. Advances in Knowledge Discovery and Data Mining, Part I, 2015, 9077: 239-250.
  • [49] Liu, Ruiqi; Xu, Ganggang; Shang, Zuofeng. Distributed adaptive nearest neighbor classifier: algorithm and theory. Statistics and Computing, 2023, 33(5).
  • [50] Rybicki, Jedrzej; Frenklach, Tatiana; Puzis, Rami. DISCONA: distributed sample compression for nearest neighbor algorithm. Applied Intelligence, 2023, 53(17): 19976-19989.