An approximated decision-theoretic algorithm for minimization of the Tversky loss under the multi-label framework

Cited by: 0
Authors
Pawel Trajdos
Marek Kurzynski
Affiliations
[1] Wroclaw University of Technology, Department of Systems and Computer Networks
Source
Pattern Analysis and Applications, 2019, 22(2): 389-416
Keywords
Multi-label classification; Tversky index; F measure; Plug-in-rule classifier; Structured output
DOI: not available
Abstract
In this paper, we address the problem of building a decision-theoretic classifier tailored for minimizing the Tversky loss under the framework of multi-label classification. The proposed approach is a generalization of the Dembczyński $F_{\beta}$ measure optimization algorithm. The introduced technique is based on a series of discrete linear approximations of the Tversky measure; the approximated criterion is then optimized using the original optimization algorithm. To assess the quality of the classification results produced by the designed strategy, and to compare its outcome with the results obtained by state-of-the-art approaches, we conducted an experimental study on 24 benchmark datasets. The investigated methods were compared with respect to eleven quality criteria belonging to three main groups: example-based, micro-averaged and macro-averaged. During the experimental study, we considered four testing scenarios. Two of them deal with a symmetric variant of the Tversky loss; the remaining two examine an asymmetric Tversky loss. The study shows that, in general, the proposed method is comparable to the Dembczyński approach. However, for both symmetric scenarios and one asymmetric scenario, the average ranks suggest that the proposed approach achieves better classification quality in terms of the example-based Tversky measure. This is an important result because the proposed method was designed to optimize this very quality indicator. Additionally, the introduced procedure can outperform the reference methods with respect to the zero-one loss under all testing scenarios.
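For orientation, a textbook form of the Tversky measure (and the loss derived from it) for a single example is sketched below, written for a ground-truth label vector $\mathbf{y} \in \{0,1\}^m$ and a prediction $\mathbf{h} \in \{0,1\}^m$ over $m$ labels. The shorthands TP, FN and FP and the placement of the $\alpha$ and $\beta$ penalties follow the common convention and may differ from the exact parameterization used in the paper.
\[
T_{\alpha,\beta}(\mathbf{y},\mathbf{h})
  = \frac{\mathrm{TP}}{\mathrm{TP} + \alpha\,\mathrm{FN} + \beta\,\mathrm{FP}},
\qquad
\mathrm{TP} = \sum_{i=1}^{m} y_i h_i,\quad
\mathrm{FN} = \sum_{i=1}^{m} y_i (1 - h_i),\quad
\mathrm{FP} = \sum_{i=1}^{m} (1 - y_i) h_i,
\]
\[
L_{\alpha,\beta}(\mathbf{y},\mathbf{h}) = 1 - T_{\alpha,\beta}(\mathbf{y},\mathbf{h}).
\]
Choosing $\alpha = \beta = \tfrac{1}{2}$ recovers the Dice coefficient, i.e. the $F_1$ measure, and $\alpha = \beta = 1$ gives the Jaccard index; more generally, $\alpha = \beta$ corresponds to the symmetric testing scenarios and $\alpha \neq \beta$ to the asymmetric ones mentioned in the abstract.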
Pages: 389-416
Number of pages: 27
Related papers
21 records in total
  • [1] An approximated decision-theoretic algorithm for minimization of the Tversky loss under the multi-label framework
    Trajdos, Pawel
    Kurzynski, Marek
    [J]. PATTERN ANALYSIS AND APPLICATIONS, 2019, 22 (02) : 389 - 416
  • [2] On label dependence and loss minimization in multi-label classification
    Dembczynski, Krzysztof
    Waegeman, Willem
    Cheng, Weiwei
    Huellermeier, Eyke
    [J]. MACHINE LEARNING, 2012, 88 (1-2) : 5 - 45
  • [3] Minimization of Decision Tree Depth for Multi-label Decision Tables
    Azad, Mohammad
    Moshkov, Mikhail
    [J]. 2014 IEEE INTERNATIONAL CONFERENCE ON GRANULAR COMPUTING (GRC), 2014, : 7 - 12
  • [4] The allocation of benefits under uncertainty: a decision-theoretic framework
    Abul Naga, RH
    [J]. ECONOMIC MODELLING, 2003, 20 (04) : 873 - 893
  • [5] An experimental framework for evaluating loss minimization in multi-label classification via stochastic process
    Sousa Mello, Lucas Henrique
    Varejao, Flavio M.
    Rodrigues, Alexandre L.
    [J]. COMPUTATIONAL INTELLIGENCE, 2022, 38 (02) : 641 - 666
  • [6] Feature selection for multi-label learning based on variable-degree multi-granulation decision-theoretic rough sets
    Yu, Ying
    Wan, Ming
    Qian, Jin
    Miao, Duoqian
    Zhang, Zhiqiang
    Zhao, Pengfei
    [J]. INTERNATIONAL JOURNAL OF APPROXIMATE REASONING, 2024, 169
  • [7] Multi-agent probabilistic search in a sequential decision-theoretic framework
    Chung, Timothy H.
    Burdick, Joel W.
    [J]. 2008 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-9, 2008, : 146 - +
  • [8] Novelty Detection under Multi-label Multi-instance Framework
    Lou, Qi
    Raich, Raviv
    Briggs, Forrest
    Fern, Xiaoli Z.
    [J]. 2013 IEEE INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2013,
  • [9] A Multi-label Classification Framework Using the Covering Based Decision Table
    Pham, Thanh-Huyen
    Phan, Van-Tuan
    Pham, Thi-Ngan
    Vuong, Thi-Hong
    Nguyen, Tri-Thanh
    Ha, Quang-Thuy
    [J]. RECENT CHALLENGES IN INTELLIGENT INFORMATION AND DATABASE SYSTEMS, ACIIDS 2022, 2022, 1716 : 462 - 476