In-Depth Research and Analysis of Multilabel Learning Algorithm

Cited by: 0
Authors
Li, Daowang [1 ]
Wang, Canwei [1 ]
Affiliation
[1] Shandong Management Univ, Dept Informat & Engn, Jinan 250357, Peoples R China
Keywords
Learning algorithms;
DOI
10.1155/2022/7737166
Chinese Library Classification (CLC)
TM [Electrical engineering]; TN [Electronic and communication technology];
Discipline code
0808; 0809;
Abstract
Multilabel learning (MLL), a hot topic in the field of machine learning, has attracted wide attention from many scholars because of its ability to express polysemy in the output space. In recent years, a large number of results on MLL have emerged, and among them several typical issues deserve attention. First, the correlation among labels plays a key role in training MLL models, and many MLL algorithms try to exploit this correlation fully and effectively to improve performance. Second, existing MLL evaluation metrics, which differ from those used in binary classification, each reflect the generalization performance of an MLL classifier only in certain aspects; how to choose metrics so as to improve both generalization performance and fairness of algorithms is another issue of concern. Third, in many practical MLL applications, training datasets contain many unlabeled instances because of the cost of labeling; exploiting the wealth of information contained in the correlations among unlabeled instances may help reduce the labeling cost of MLL and improve performance. Fourth, the labels assigned to an instance may not be equally descriptive in many applications; how to describe the importance of each label in the output space to an instance has become one of the research points that many scholars have focused on in recent years. This paper reviews MLL-related research on label correlation, evaluation metrics, multilabel semisupervised learning, and label distribution learning (LDL) from a theoretical and algorithmic perspective. Finally, the related research work on MLL is summarized and discussed.
Pages: 16
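
To illustrate two of the points raised in the abstract, the sketch below shows how standard multilabel evaluation metrics differ from binary-classification accuracy, and how LDL replaces 0/1 relevance with per-label description degrees. It is a minimal illustration, not taken from the paper; the toy label matrices and distribution values are assumptions.

```python
import numpy as np

# Hypothetical toy data (not from the paper): rows are instances,
# columns are labels; 1 = relevant label, 0 = irrelevant label.
Y_true = np.array([[1, 0, 1, 0],
                   [0, 1, 1, 0],
                   [1, 1, 0, 1]])
Y_pred = np.array([[1, 0, 0, 0],
                   [0, 1, 1, 0],
                   [1, 0, 0, 1]])

# Hamming loss: fraction of misclassified instance-label pairs.
hamming_loss = np.mean(Y_true != Y_pred)

# Subset accuracy: fraction of instances whose predicted label set
# matches the ground truth exactly (a much stricter criterion).
subset_accuracy = np.mean(np.all(Y_true == Y_pred, axis=1))

print(f"Hamming loss:    {hamming_loss:.3f}")    # -> 0.250
print(f"Subset accuracy: {subset_accuracy:.3f}")  # -> 0.333

# Label distribution learning (LDL) view: instead of a 0/1 relevance
# vector, each instance carries a description degree per label, and the
# degrees sum to 1 (values here are illustrative only).
label_distribution = np.array([0.5, 0.1, 0.3, 0.1])
assert abs(label_distribution.sum() - 1.0) < 1e-9
```

The two metrics disagree on the same predictions (low Hamming loss but low subset accuracy), which is the kind of metric-dependent behavior the survey discusses when addressing metric choice and fairness.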