Progressive random k-labelsets for cost-sensitive multi-label classification

Cited by: 0
Authors
Yu-Ping Wu
Hsuan-Tien Lin
Affiliation
National Taiwan University
Source
Machine Learning | 2017, Vol. 106
Keywords
Machine learning; Multi-label classification; Loss function; Cost-sensitive learning; Labelset; Ensemble method
DOI
Not available
Abstract
In multi-label classification, an instance is associated with multiple relevant labels, and the goal is to predict these labels simultaneously. Many real-world applications of multi-label classification come with different performance evaluation criteria. It is thus important to design general multi-label classification methods that can flexibly take different criteria into account. Such methods tackle the problem of cost-sensitive multi-label classification (CSMLC). Most existing CSMLC methods either suffer from high computational complexity or focus on only certain specific criteria. In this work, we propose a novel CSMLC method, named progressive random k-labelsets (PRAkEL), to resolve the two issues above. The method is extended from a popular multi-label classification method, random k-labelsets, and hence inherits its efficiency. Furthermore, the proposed method can handle arbitrary example-based evaluation criteria by progressively transforming the CSMLC problem into a series of cost-sensitive multi-class classification problems. Experimental results demonstrate that PRAkEL is competitive with existing methods under the specific criteria they can optimize, and is superior under other criteria.
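The abstract only sketches how the reduction works: draw random k-labelsets as in RAkEL, treat the label combination on each labelset as a single multi-class target, and make each sub-problem cost-sensitive so that an arbitrary example-based criterion can be plugged in. The toy Python sketch below illustrates that general labelset-to-multi-class reduction; it is not the authors' PRAkEL algorithm. In particular, the class name TinyCostSensitiveRAkEL, the Hamming-loss cost, the decision-tree base learner, and the crude cost-to-example-weight reduction are assumptions made purely for illustration, and the paper's progressive transformation of the sub-problems is not reproduced here.

```python
# Minimal sketch of a RAkEL-style reduction with a simple cost-sensitive twist.
# NOT the PRAkEL algorithm from the paper; names and choices are illustrative.
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def hamming_cost(y_true, y_pred):
    """Example-based cost: fraction of label positions that disagree."""
    return np.mean(y_true != y_pred)


class TinyCostSensitiveRAkEL:
    """Toy k-labelsets ensemble with a weight-based cost-sensitive reduction."""

    def __init__(self, k=3, n_labelsets=5, random_state=0):
        self.k = k
        self.n_labelsets = n_labelsets
        self.rng = np.random.default_rng(random_state)

    def fit(self, X, Y, cost=hamming_cost):
        n, L = Y.shape
        self.L = L
        self.models, self.labelsets, self.codebooks = [], [], []
        for _ in range(self.n_labelsets):
            # Draw a random k-labelset S, as in RAkEL.
            S = self.rng.choice(L, size=self.k, replace=False)
            # Encode each observed label combination on S as one multi-class label.
            codes, y_mc = np.unique(Y[:, S], axis=0, return_inverse=True)
            y_mc = y_mc.ravel()
            # Crude cost-sensitive reduction (an assumption, not the paper's
            # transformation): weight each example by the worst cost any wrong
            # label combination could incur on it.
            weights = np.ones(n)
            for i in range(n):
                wrong = codes[(codes != Y[i, S]).any(axis=1)]
                if len(wrong):
                    weights[i] = max(cost(Y[i, S], c) for c in wrong)
            clf = DecisionTreeClassifier(max_depth=5, random_state=0)
            clf.fit(X, y_mc, sample_weight=weights)
            self.models.append(clf)
            self.labelsets.append(S)
            self.codebooks.append(codes)
        return self

    def predict(self, X):
        votes = np.zeros((len(X), self.L))
        counts = np.zeros(self.L)
        for clf, S, codes in zip(self.models, self.labelsets, self.codebooks):
            pred = codes[clf.predict(X)]  # decode multi-class output back to k labels
            votes[:, S] += pred
            counts[S] += 1
        # Per-label majority vote over the labelsets that cover each label.
        return (votes / np.maximum(counts, 1) >= 0.5).astype(int)


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 10))
    Y = (X[:, :6] + rng.normal(scale=0.5, size=(200, 6)) > 0).astype(int)
    model = TinyCostSensitiveRAkEL(k=3, n_labelsets=6).fit(X, Y)
    print("training Hamming loss:", hamming_cost(Y, model.predict(X)))
```

Predictions are recovered by decoding each sub-problem's multi-class output back to its k labels and thresholding the per-label vote fraction at 0.5, the standard RAkEL-style aggregation; swapping in a different cost function in fit is what makes the sketch nominally criterion-aware.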
Pages: 671-694
Page count: 23
Related papers
50 records in total
  • [1] Progressive random k-labelsets for cost-sensitive multi-label classification
    Wu, Yu-Ping; Lin, Hsuan-Tien
    Machine Learning, 2017, 106(5): 671-694
  • [2] Generalized k-Labelsets Ensemble for Multi-Label and Cost-Sensitive Classification
    Lo, Hung-Yi; Lin, Shou-De; Wang, Hsin-Min
    IEEE Transactions on Knowledge and Data Engineering, 2014, 26(7): 1679-1691
  • [3] Calibrated k-labelsets for Ensemble Multi-label Classification
    Gharroudi, Ouadie; Elghazel, Haytham; Aussem, Alex
    Neural Information Processing, Part I, 2015, 9489: 573-582
  • [4] Fast Random k-Labelsets for Large-Scale Multi-Label Classification
    Kimura, Keigo; Kudo, Mineichi; Sun, Lu; Koujaku, Sadamori
    2016 23rd International Conference on Pattern Recognition (ICPR), 2016: 438-443
  • [5] Active k-labelsets ensemble for multi-label classification
    Wang, Ran; Kwong, Sam; Wang, Xu; Jia, Yuheng
    Pattern Recognition, 2021, 109
  • [6] Correlation-Based Weighted K-Labelsets for Multi-label Classification
    Xu, Jingyang; Ma, Jun
    Web Technologies and Applications, Part I, 2016, 9931: 408-419
  • [7] Mutual Information Based K-Labelsets Ensemble for Multi-Label Classification
    Wang, Ran; Kwong, Sam; Jia, Yuheng; Huang, Zhiqi; Wu, Lang
    2018 IEEE International Conference on Fuzzy Systems (FUZZ-IEEE), 2018
  • [8] Local positive and negative correlation-based k-labelsets for multi-label classification
    Nan, Guofang; Li, Qiwang; Dou, Runliang; Liu, Jing
    Neurocomputing, 2018, 318: 90-101
  • [9] Cost-sensitive label embedding for multi-label classification
    Huang, Kuan-Hao; Lin, Hsuan-Tien
    Machine Learning, 2017, 106(9-10): 1725-1746