An Improved Weighted Base Classification for Optimum Weighted Nearest Neighbor Classifiers

Cited by: 0
Authors
Abbas, Muhammad [1 ]
Memon, Kamran Ali [2 ]
ul Ain, Noor [3 ]
Ajebesone, Ekang Francis [3 ]
Usaid, Muhammad [4 ]
Bhutto, Zulfiqar Ali [5 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Comp Sci, Beijing, Peoples R China
[2] Beijing Univ Posts & Telecommun, Sch Elect Engn, Beijing, Peoples R China
[3] Beijing Univ Posts & Telecommun, Sch Informat & Commun Engn, Beijing, Peoples R China
[4] Mehran Univ Engn & Technol, Dept Elect Engn, Jamshoro, Pakistan
[5] Dawood Univ Engn & Technol, Karachi, Pakistan
Keywords
Classification; k-Nearest Neighbor (kNN); Logistic Regression; Decision Trees; Cross-Validation; Machine Learning (ML); SVM; Random Forest; Improved Version of k-Nearest Neighbor (IVkNN); Python
DOI
10.4108/eai.13-7-2018.163339
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Existing classification studies use two non-parametric classifiers, k-nearest neighbours (kNN) and decision trees, and one parametric classifier, logistic regression, generating high accuracies. Previous research has compared the results of these classifiers with training patterns of different sizes to study alcohol tests. In this paper, the Improved Version of the kNN (IVkNN) algorithm is presented, which overcomes a limitation of the conventional kNN algorithm in classifying wine quality. The proposed method typically identifies the same number of nearest neighbours for each test example. Results indicate a higher Overall Accuracy (OA) that oscillates between 67% and 76%. Among the three classifiers, kNN was the least sensitive to the training sample size and produced the unrivalled OA, followed by sequential decision trees and logistic regression. Based on the sample size, the proposed IVkNN model achieved 80% accuracy and a root mean square error (RMSE) of 0.375.
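The weighted nearest-neighbour idea underlying this line of work can be illustrated with a minimal sketch: each of a test point's k nearest neighbours casts a vote for its class, weighted by inverse distance, so closer neighbours count more. This is a generic distance-weighted kNN, not the paper's IVkNN implementation; the function name and toy data below are illustrative only.

```python
import numpy as np

def weighted_knn_predict(X_train, y_train, x, k=3):
    """Classify x by a vote among its k nearest neighbours,
    with each neighbour's vote weighted by inverse distance."""
    d = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances to all training points
    nearest = np.argsort(d)[:k]               # indices of the k closest neighbours
    votes = {}
    for i in nearest:
        w = 1.0 / (d[i] + 1e-9)               # inverse-distance weight (epsilon avoids /0)
        votes[y_train[i]] = votes.get(y_train[i], 0.0) + w
    return max(votes, key=votes.get)          # class with the largest total weight

# toy example: two well-separated clusters
X = np.array([[0.0, 0.0], [0.1, 0.1], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(weighted_knn_predict(X, y, np.array([0.05, 0.0])))  # → 0
```

With k=3, the query point picks up two class-0 neighbours and one class-1 neighbour; the inverse-distance weighting further suppresses the distant class-1 vote, so the prediction is class 0.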
Pages: 1-8 (8 pages)