Evaluating the performance of bagging-based k-nearest neighbor ensemble with the voting rule selection method

Cited by: 0
Authors
M. S. Suchithra
Maya L. Pai
Affiliations
[1] Amrita School of Arts and Sciences, Department of Computer Science and IT
[2] Amrita Vishwa Vidyapeetham
Source: Multimedia Tools and Applications, 2022, 81 (15)
Keywords
Label ranking; Ensembles; K-nearest neighbor label ranker; Rank aggregation; VRS;
DOI: not available
Abstract
Label ranking problems seek a mapping from instances to rankings over a fixed set of labels. A standard way to improve the predictive performance of label ranking models is to use ensembles, which combine the outputs of multiple simple base models into a single prediction. Here, ensemble learning is realized by applying nearest neighbor estimation within a bagging approach that resamples the training instances. The case for a label ranking ensemble is made through a detailed analysis of suitable existing algorithms: with appropriately chosen parameters, the k-nearest neighbor label ranker achieves better prediction performance than existing label ranking ensembles, with accuracies of 85% to 99% on 21 label ranking datasets. Any such ensemble can be further improved through voting rule selection. Integrating the Voting Rule Selector (VRS) algorithm and seven commonly used voting rules with the k-nearest neighbor label ranker, this study finds that VRS and Copeland aggregation work more efficiently than Borda aggregation in dataset-level learning, and the k-nearest neighbor label ranker with VRS or Copeland aggregation ranks first on most datasets. At the dataset level, VRS obtains an average improvement of 48.02% over the plain k-nearest neighbor model, almost equal to Copeland's 47.84%.
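The rank-aggregation step described in the abstract — combining the label rankings returned by the k nearest neighbors (or bagged base rankers) under the Borda or Copeland voting rule — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the label names and example rankings are invented for demonstration.

```python
from collections import defaultdict
from itertools import combinations

def borda_aggregate(rankings):
    """Borda count: each label earns points equal to the number of
    labels ranked below it; labels are sorted by total score."""
    scores = defaultdict(int)
    for ranking in rankings:
        n = len(ranking)
        for pos, label in enumerate(ranking):
            scores[label] += n - 1 - pos
    return sorted(scores, key=lambda lab: -scores[lab])

def copeland_aggregate(rankings):
    """Copeland rule: a label's score is its number of pairwise
    majority wins minus its number of pairwise losses."""
    labels = rankings[0]
    score = defaultdict(int)
    for a, b in combinations(labels, 2):
        a_pref = sum(r.index(a) < r.index(b) for r in rankings)
        b_pref = len(rankings) - a_pref
        if a_pref > b_pref:
            score[a] += 1
            score[b] -= 1
        elif b_pref > a_pref:
            score[b] += 1
            score[a] -= 1
    return sorted(labels, key=lambda lab: -score[lab])

# Hypothetical rankings from the k = 3 nearest neighbors of a query instance.
neighbour_rankings = [
    ["A", "B", "C"],
    ["A", "C", "B"],
    ["B", "A", "C"],
]
print(borda_aggregate(neighbour_rankings))     # ['A', 'B', 'C']
print(copeland_aggregate(neighbour_rankings))  # ['A', 'B', 'C']
```

On this toy input both rules agree; they can diverge on larger profiles, which is why the VRS procedure described in the paper selects a voting rule per dataset rather than fixing one globally.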
Pages: 20741 - 20762 (21 pages)
Related papers (50 in total)
  • [1] Evaluating the performance of bagging-based k-nearest neighbor ensemble with the voting rule selection method
    Suchithra, M. S.
    Pai, Maya L.
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2022, 81 (15) : 20741 - 20762
  • [2] A Novel Weighted Voting for K-Nearest Neighbor Rule
    Gou, Jianping
    Xiong, Taisong
    Kuang, Yin
    [J]. JOURNAL OF COMPUTERS, 2011, 6 (05) : 833 - 840
  • [3] K-Nearest Neighbor based Bagging SVM Pruning
    Ye, Ren
    Le, Zhang
    Suganthan, P. N.
    [J]. PROCEEDINGS OF THE 2013 IEEE SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND ENSEMBLE LEARNING (CIEL), 2013: 25 - 30
  • [4] A novel ensemble method for k-nearest neighbor
    Zhang, Youqiang
    Cao, Guo
    Wang, Bisheng
    Li, Xuesong
    [J]. PATTERN RECOGNITION, 2019, 85 : 13 - 25
  • [5] A GENERALIZED K-NEAREST NEIGHBOR RULE
    PATRICK, EA
    FISCHER, FP
    [J]. INFORMATION AND CONTROL, 1970, 16 (02): 128 -
  • [6] A dynamic ensemble outlier detection model based on an adaptive k-nearest neighbor rule
    Wang, Biao
    Mao, Zhizhong
    [J]. INFORMATION FUSION, 2020, 63 : 30 - 40
  • [7] A "soft" K-Nearest Neighbor voting scheme
    Mitchell, HB
    Schaefer, PA
    [J]. INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2001, 16 (04) : 459 - 468
  • [8] Improving K-Nearest Neighbor Rule with Dual Weighted Voting for Pattern Classification
    Gou, Jianping
    Luo, Mingying
    Xiong, Taisong
    [J]. COMPUTER SCIENCE FOR ENVIRONMENTAL ENGINEERING AND ECOINFORMATICS, PT 2, 2011, 159 : 118 - 123
  • [9] A New Feature Selection Method Based on K-Nearest Neighbor Approach
    Wang, Xianchang
    Zhang, Lishi
    Ma, Yonggang
    [J]. PROCEEDINGS OF THE 2016 7TH INTERNATIONAL CONFERENCE ON EDUCATION, MANAGEMENT, COMPUTER AND MEDICINE (EMCM 2016), 2017, 59 : 657 - 660
  • [10] Bagging-based spectral clustering ensemble selection
    Jia, Jianhua
    Xiao, Xuan
    Liu, Bingxiang
    Jiao, Licheng
    [J]. PATTERN RECOGNITION LETTERS, 2011, 32 (10) : 1456 - 1467