Using Dominant Sets for k-NN Prototype Selection

Cited by: 0
Authors
Vascon, Sebastiano [1 ]
Cristani, Marco [1 ]
Pelillo, Marcello [2 ]
Murino, Vittorio [1 ]
Affiliations
[1] Ist Italiano Tecnol, Pattern Anal & Comp Vis PAVIS, Via Morego 30, I-16163 Genoa, Italy
[2] Univ Cafoscari Venice, DAIS, I-30172 Venice, Italy
Source
IMAGE ANALYSIS AND PROCESSING (ICIAP 2013), PT II | 2013 / Vol. 8157
Keywords
K-nearest neighbors; Prototype selection; Classification; Dominant set; Data reduction;
DOI
None available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
k-Nearest Neighbors is one of the most important and widely adopted non-parametric classification methods in pattern recognition. It has evolved in several respects over the last 50 years, and one of its best-known variants relies on prototypes: a prototype distills a group of similar training points, drastically reducing the number of comparisons needed for classification; prototypes are typically employed when the cardinality of the training data is high. In this paper, using the dominant set clustering framework, we propose four novel strategies for prototype generation, which produce representative prototypes that mirror the underlying class structure in an expressive and effective way. Our strategy boosts k-NN classification performance: considering heterogeneous metrics and analyzing 15 diverse datasets, our approach ranks among the top 6 prototype-based k-NN methods, with a computational cost far lower than that of all competitors. In addition, we show that our proposal beats linear SVM in a pedestrian detection scenario.
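The abstract describes prototype-based k-NN: each prototype distills a group of similar training points, so a query is compared against a handful of prototypes rather than the whole training set. The paper's four dominant-set strategies are not detailed in the abstract; the sketch below illustrates only the general prototype idea, using one class-mean prototype per class as a hypothetical stand-in for the actual clustering step (all function names and the toy data are illustrative, not from the paper).

```python
import numpy as np

def class_mean_prototypes(X, y):
    """One prototype per class: the class mean.
    A crude stand-in for dominant-set-derived prototypes,
    which the abstract does not specify in detail."""
    labels = np.unique(y)
    protos = np.array([X[y == c].mean(axis=0) for c in labels])
    return protos, labels

def classify(x, protos, labels):
    """1-NN over the prototypes instead of over all training points."""
    dists = np.linalg.norm(protos - x, axis=1)
    return labels[np.argmin(dists)]

# Toy data: two well-separated 2-D classes.
X = np.array([[0.0, 0.0], [0.2, 0.1], [5.0, 5.0], [5.1, 4.9]])
y = np.array([0, 0, 1, 1])

protos, labels = class_mean_prototypes(X, y)
print(classify(np.array([0.1, 0.0]), protos, labels))  # → 0
print(classify(np.array([4.8, 5.2]), protos, labels))  # → 1
```

With N training points and P prototypes (P ≪ N), each query costs O(P) distance computations instead of O(N), which is the data-reduction benefit the abstract refers to.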
Pages: 131-140
Page count: 10