Feature weighting to tackle label dependencies in multi-label stacking nearest neighbor

Cited by: 1
Authors
Rastin, Niloofar [1 ]
Jahromi, Mansoor Zolghadri [1 ]
Taheri, Mohammad [1 ]
Affiliations
[1] Shiraz University, School of Electrical and Computer Engineering, Shiraz, Iran
Keywords
Multi-label classification; Stacking; Meta binary relevance; Label correlations; Nearest neighbor; Feature weighting; Classification; Cost
DOI
10.1007/s10489-020-02073-9
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
In multi-label learning, each instance is associated with a subset of predefined labels. A common stacking-based approach to multi-label classification, proposed by Godbole and Sarawagi (2004), is Meta Binary Relevance (MBR). It uses two layers of binary models and feeds the outputs of the first layer to every binary model of the second layer: the class labels predicted in the first layer are appended to the original features, and the second layer predicts the classes again from this extended representation. When the second layer predicts a specific label, however, the predictions for irrelevant labels act as noisy features. For this reason, the Nearest Neighbor (NN) classifier, which is sensitive to noisy features, has so far not been a suitable base classifier for this stacking method, and all of its merits, including simplicity, interpretability, global stability against label noise, and good performance, are lost. As the first contribution, a popular feature-weighting scheme for NN classification is used here to solve the problem of uncorrelated labels: it tunes a parametric distance function by gradient descent to minimize the classification error on the training data. However, other objectives, including the F-measure, are known to be more suitable than classification error when learning from imbalanced data. The second contribution of this paper is an extension of this method aimed at improving the F-measure. In our experimental study, the proposed method is compared with state-of-the-art multi-label classifiers from the literature and outperforms them.
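To make the stacked architecture concrete, the following is a minimal sketch of an MBR-style pipeline with a feature-weighted 1-NN as the second-layer base classifier. It assumes only NumPy; the names WeightedNN and mbr_fit_predict are illustrative, a single shared weight vector stands in for the paper's per-model weights, and the gradient-descent tuning of the weights (against classification error or an F-measure surrogate) is not reproduced here.

# Minimal sketch of Meta Binary Relevance (MBR) stacking with a
# feature-weighted 1-NN second layer. Names are illustrative; the
# paper's gradient-descent weight tuning is NOT implemented here.
import numpy as np

class WeightedNN:
    """1-NN under a per-feature weighted squared Euclidean distance
    d_w(a, b) = sum_f w_f * (a_f - b_f)^2. In the paper, w is tuned by
    gradient descent; here it is simply supplied by the caller."""
    def __init__(self, w):
        self.w = np.asarray(w, dtype=float)

    def fit(self, X, y):
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y)
        return self

    def _dist(self, X):
        diff = np.asarray(X, dtype=float)[:, None, :] - self.X[None, :, :]
        return (self.w * diff ** 2).sum(axis=2)

    def predict(self, X, exclude_self=False):
        d = self._dist(X)
        if exclude_self:                  # leave-one-out on the training set
            np.fill_diagonal(d, np.inf)   # so a point is never its own neighbor
        return self.y[d.argmin(axis=1)]

def mbr_fit_predict(X_tr, Y_tr, X_te, w2):
    """Two-layer MBR: first-layer binary predictions are appended to the
    original features (as meta-features) before the second layer predicts.
    w2 must have length n_features + n_labels; one shared vector is used
    here, whereas the paper learns weights per second-layer model."""
    n_labels = Y_tr.shape[1]
    layer1 = [WeightedNN(np.ones(X_tr.shape[1])).fit(X_tr, Y_tr[:, j])
              for j in range(n_labels)]
    # Leave-one-out predictions on the training set avoid 1-NN memorization.
    Z_tr = np.column_stack([m.predict(X_tr, exclude_self=True) for m in layer1])
    Z_te = np.column_stack([m.predict(X_te) for m in layer1])
    M_tr, M_te = np.hstack([X_tr, Z_tr]), np.hstack([X_te, Z_te])
    layer2 = [WeightedNN(w2).fit(M_tr, Y_tr[:, j]) for j in range(n_labels)]
    return np.column_stack([m.predict(M_te) for m in layer2])

Down-weighting the meta-feature dimensions of w2 that originate from uncorrelated labels is exactly the lever the weight-tuning procedure exploits to keep NN usable inside stacking.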
Pages: 5200-5218
Page count: 19
Related papers (50 in total)
  • [1] Feature weighting to tackle label dependencies in multi-label stacking nearest neighbor
    Rastin, Niloofar; Jahromi, Mansoor Zolghadri; Taheri, Mohammad
    Applied Intelligence, 2021, 51: 5200-5218
  • [2] Fuzzy rough discrimination and label weighting for multi-label feature selection
    Tan, Anhui; Liang, Jiye; Wu, Wei-Zhi; Zhang, Jia; Sun, Lin; Chen, Chao
    Neurocomputing, 2021, 465: 128-140
  • [3] Dynamic feature weighting for multi-label classification problems
    Dialameh, Maryam; Hamzeh, Ali
    Progress in Artificial Intelligence, 2021, 10(03): 283-295
  • [4] A Label Correlation Based Weighting Feature Selection Approach for Multi-label Data
    Liu, Lu; Zhang, Jing; Li, Peipei; Zhang, Yuhong; Hu, Xuegang
    Web-Age Information Management, Pt II, 2016, 9659: 369-379
  • [5] AnnexML: Approximate Nearest Neighbor Search for Extreme Multi-label Classification
    Tagami, Yukihiro
    KDD'17: Proceedings of the 23rd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, 2017: 455-464
  • [6] A Locally Adaptive Multi-Label k-Nearest Neighbor Algorithm
    Wang, Dengbao; Wang, Jingyuan; Hu, Fei; Li, Li; Zhang, Xiuzhen
    Advances in Knowledge Discovery and Data Mining, PAKDD 2018, Pt I, 2018, 10937: 81-93
  • [7] A Coupled k-Nearest Neighbor Algorithm for Multi-label Classification
    Liu, Chunming; Cao, Longbing
    Advances in Knowledge Discovery and Data Mining, Part I, 2015, 9077: 176-187
  • [8] Recursive Nearest Neighbor Graph Partitioning for Extreme Multi-Label Learning
    Tagami, Yukihiro
    IEICE Transactions on Information and Systems, 2019, E102D(03): 579-587
  • [9] A k-nearest neighbor based algorithm for multi-label classification
    Zhang, M. L.; Zhou, Z. H.
    2005 IEEE International Conference on Granular Computing, Vols 1 and 2, 2005: 718-721