Cohen's Kappa Coefficient as a Performance Measure for Feature Selection

Cited by: 158
Authors
Vieira, Susana M. [1 ]
Kaymak, Uzay [2 ]
Sousa, Joao M. C. [1 ]
Affiliations
[1] Univ Tecn Lisboa, Inst Super Tecn, Dept Mech Engn, CIS IDMEC LAETA, Av Rovisco Pais, P-1049001 Lisbon, Portugal
[2] Erasmus Univ, Inst Econometr, Erasmus Sch Econ, Rotterdam, Netherlands
Source
2010 IEEE INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS (FUZZ-IEEE 2010), 2010
DOI
10.1109/FUZZY.2010.5584447
CLC classification
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Measuring the performance of a given classifier is not a straightforward or easy task. Depending on the application, the overall classification rate may not be sufficient if one or more of the classes fails in prediction. This problem is also reflected in the feature selection process, especially when a wrapper method is used. Cohen's kappa coefficient is a statistical measure of inter-rater agreement for qualitative items. It is generally considered a more robust measure than simple percent agreement, since it accounts for agreement occurring by chance. Because kappa is a more conservative measure, it is well suited for evaluating model performance in wrapper feature selection. This paper proposes the use of the kappa measure as an evaluation measure in a feature selection wrapper approach. In the proposed approach, fuzzy models are used to test the feature subsets and fuzzy criteria are used to formulate the feature selection problem. Results show that using the kappa measure leads to more accurate classifiers, and therefore to feature subset solutions with more relevant features.
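The chance-corrected agreement the abstract describes is computed as κ = (p_o − p_e) / (1 − p_e), where p_o is the observed agreement and p_e is the agreement expected by chance from each rater's marginal label frequencies. A minimal sketch (the function name and example labels are illustrative, not from the paper):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two equal-length label sequences."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of items both raters label identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the two raters' marginal frequencies,
    # summed over labels.
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n**2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical two-class example: true labels vs. classifier predictions.
y_true = [1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 0, 0, 1, 1]
kappa = cohens_kappa(y_true, y_pred)  # p_o = 4/6, p_e = 1/2, kappa = 1/3
```

In a wrapper setting, `kappa` would replace the plain accuracy score when scoring each candidate feature subset; note the formula is undefined when p_e = 1 (both raters use a single identical label).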
Pages: 8
Related papers (50 in total)
  • [21] Reducing feature selection bias using a model independent performance measure
    Ni, Weizeng
    Xu, Nuo
    Dai, Honghao
    Huang, Samuel H.
    International Journal of Data Science, 2020, 5 (03) : 229 - 246
  • [22] An evaluation of classifier-specific filter measure performance for feature selection
    Freeman, Cecille
    Kulic, Dana
    Basir, Otman
    PATTERN RECOGNITION, 2015, 48 (05) : 1812 - 1826
  • [23] Application of the Hypothesis Analysis Method Using Cohen's Kappa Index to Measure the Agreement between Leather Sorters
    Casey, Patricia
    Altobelli, Gustavo
    Pignatelli, Pablo
    XXX CONGRESS OF THE INTERNATIONAL UNION OF LEATHER TECHNOLOGISTS & CHEMISTS SOCIETIES, PROCEEDINGS, 2009, : 11 - 17
  • [24] Weighted kappa is higher than Cohen's kappa for tridiagonal agreement tables
    Warrens, Matthijs J.
    STATISTICAL METHODOLOGY, 2011, 8 (02) : 268 - 272
  • [25] Application of the Hypothesis Analysis Method Using Cohen's Kappa Index to Measure the Agreement Between Leather Sorters
    Casey, Patricia
    Altobelli, Gustavo
    Pignatelli, Pablo
    JOURNAL OF THE SOCIETY OF LEATHER TECHNOLOGISTS AND CHEMISTS, 2010, 94 (04): : 144 - 148
  • [27] Cohen's linearly weighted kappa is a weighted average
    Warrens, Matthijs J.
    ADVANCES IN DATA ANALYSIS AND CLASSIFICATION, 2012, 6 (01) : 67 - 79
  • [28] Sample-size calculations for Cohen's kappa
    Cantor, AB
    PSYCHOLOGICAL METHODS, 1996, 1 (02) : 150 - 153
  • [29] The dependence of Cohen's kappa on the prevalence does not matter
    Vach, W
    JOURNAL OF CLINICAL EPIDEMIOLOGY, 2005, 58 (07) : 655 - 661
  • [30] A Formal Proof of a Paradox Associated with Cohen's Kappa
    Warrens, Matthijs J.
    JOURNAL OF CLASSIFICATION, 2010, 27 (03) : 322 - 332