Label Embedding for Multi-label Classification Via Dependence Maximization

Cited by: 0
Authors
Yachong Li
Youlong Yang
Affiliation
[1] Xidian University, School of Mathematics and Statistics
Source
Neural Processing Letters, 2020, Volume 52
Keywords
Multi-label learning; Label embedding; Low-rank factorization; Hilbert–Schmidt independence criterion; Missing labels
DOI
Not available
Abstract
Multi-label classification has attracted extensive attention in various fields. With the emergence of high-dimensional label spaces, much recent research has been devoted to label embedding. However, existing embedding approaches either do not take feature-space correlations sufficiently into account or require an explicit encoding function while learning the embedded space. Moreover, few of them can be extended to handle missing labels. In this paper, we propose a Label Embedding method via Dependence Maximization (LEDM), which learns a latent space in which label and feature information are embedded simultaneously. To this end, a low-rank factorization of the label matrix is applied to exploit label correlations instead of an explicit encoding process, and the dependence between the feature space and the label space is increased via the Hilbert–Schmidt independence criterion to improve predictability. The proposed LEDM can also be easily extended to handle missing labels while learning the embedded space. Comprehensive experiments on multiple data sets validate the effectiveness of our approach over state-of-the-art methods in both the complete-label and missing-label cases.
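The abstract describes LEDM only at a high level. As a rough illustration of the two ingredients it names, the sketch below computes an empirical Hilbert–Schmidt independence criterion with linear kernels and builds a low-rank label embedding that trades off reconstructing the label matrix against dependence with the features. The closed-form eigen-solution, the trade-off parameter, and the function names are illustrative assumptions, not the authors' actual LEDM optimization.

```python
import numpy as np

def empirical_hsic(X, V):
    """Biased empirical HSIC estimate between sample matrices X (n x d) and
    V (n x k) with linear kernels: HSIC = trace(H K H L) / (n - 1)^2, where
    K = X X^T, L = V V^T and H = I - (1/n) 1 1^T centers the kernels."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    K = X @ X.T          # linear kernel on the feature space
    L = V @ V.T          # linear kernel on the embedded label space
    return np.trace(H @ K @ H @ L) / (n - 1) ** 2

def low_rank_label_embedding(Y, X, k, lam=1.0):
    """Toy label embedding: factorize the label matrix Y (n x q) as V @ B with
    V (n x k, orthonormal columns) chosen to balance reconstruction of Y
    against HSIC with the features X.  Maximizing
        trace(V^T (Y Y^T + lam * H X X^T H) V)   s.t.  V^T V = I
    is solved by the top-k eigenvectors of the symmetric matrix inside.
    This is an illustrative surrogate objective, not the LEDM formulation."""
    n = Y.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    M = Y @ Y.T + lam * H @ X @ X.T @ H
    w, U = np.linalg.eigh(M)
    V = U[:, np.argsort(w)[::-1][:k]]   # top-k eigenvectors -> embedding V
    B = V.T @ Y                         # decoding matrix: Y is approximated by V @ B
    return V, B

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 20))                 # feature matrix
    Y = (rng.random((100, 15)) < 0.2).astype(float)    # sparse binary label matrix
    V, B = low_rank_label_embedding(Y, X, k=5)
    print("HSIC(X, V) =", empirical_hsic(X, V))
    print("label reconstruction error:", np.linalg.norm(Y - V @ B))
```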
Pages: 1651-1674
Number of pages: 23
Related Papers
50 records in total
  • [1] Label Embedding for Multi-label Classification Via Dependence Maximization
    Li, Yachong; Yang, Youlong
    Neural Processing Letters, 2020, 52(2): 1651-1674
  • [2] Dependence maximization based label space dimension reduction for multi-label classification
    Zhang, Ju-Jie; Fang, Min; Wang, Hongchun; Li, Xiao
    Engineering Applications of Artificial Intelligence, 2015, 45: 453-463
  • [3] A Label Embedding Method for Multi-label Classification via Exploiting Local Label Correlations
    Wang, Xidong; Li, Jun; Xu, Jianhua
    Neural Information Processing (ICONIP 2019), Part V, 2019, 1143: 168-180
  • [4] Group preserving label embedding for multi-label classification
    Kumar, Vikas; Pujari, Arun K.; Padmanabhan, Vineet; Kagita, Venkateswara Rao
    Pattern Recognition, 2019, 90: 23-34
  • [5] Multi-label text classification via joint learning from label embedding and label correlation
    Liu, Huiting; Chen, Geng; Li, Peipei; Zhao, Peng; Wu, Xindong
    Neurocomputing, 2021, 460: 385-398
  • [6] On label dependence and loss minimization in multi-label classification
    Dembczynski, Krzysztof; Waegeman, Willem; Cheng, Weiwei; Huellermeier, Eyke
    Machine Learning, 2012, 88(1-2): 5-45
  • [7] Cost-sensitive label embedding for multi-label classification
    Huang, Kuan-Hao; Lin, Hsuan-Tien
    Machine Learning, 2017, 106(9-10): 1725-1746
  • [8] Multi-label feature selection via manifold regularization and dependence maximization
    Huang, Rui; Wu, Zhejun
    Pattern Recognition, 2021, 120