Feature selection for label distribution learning under feature weight view

Citations: 0
Authors
Shidong Lin
Chenxi Wang
Yu Mao
Yaojin Lin
Affiliations
[1] Minnan Normal University,School of Computer Science
[2] Minnan Normal University,Key Laboratory of Data Science and Intelligence Application
Keywords
Feature selection; Label distribution learning; Feature weight; Mutual information; Label correlation;
DOI: not available
Abstract
Label Distribution Learning (LDL) is a fine-grained learning paradigm that addresses label ambiguity, yet it confronts the curse of dimensionality. Feature selection is an effective method for dimensionality reduction, and several algorithms have been proposed for LDL that tackle the problem from different views. In this paper, we propose a novel feature selection method for LDL. First, an effective LDL model is trained through a classical LDL loss function, which is composed of the maximum entropy model and KL divergence. Then, to select common and label-specific features, their weights are enhanced by the $l_{21}$-norm and label correlation, respectively. Considering that directly constraining the parameter by label correlation would make the label-specific features of relevant labels tend to be identical, we instead adopt the strategy of constraining the output of the maximum entropy model. Finally, the proposed method introduces Mutual Information (MI) into the optimization model for LDL feature selection for the first time, which distinguishes similar features and thus reduces the influence of redundant features. Experimental results on twelve datasets validate the effectiveness of the proposed method.
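The core objective described in the abstract, a maximum-entropy (softmax) output model fitted under a KL-divergence loss with an $l_{21}$-norm sparsity term on the feature weights, can be sketched as below. This is a minimal illustration under assumed conventions (rows of `W` index features, columns index labels); all function names are hypothetical and this is not the authors' implementation.

```python
import numpy as np

def predict_distribution(X, W):
    """Maximum-entropy model: p(y_j | x_i) proportional to exp(x_i . w_j)."""
    scores = X @ W                                  # (n_samples, n_labels)
    scores -= scores.max(axis=1, keepdims=True)     # numerical stability
    exps = np.exp(scores)
    return exps / exps.sum(axis=1, keepdims=True)   # rows sum to 1

def kl_loss(D, P, eps=1e-12):
    """Mean KL divergence from predicted distributions P to true distributions D."""
    return np.mean(np.sum(D * np.log((D + eps) / (P + eps)), axis=1))

def l21_norm(W):
    """l_{2,1}-norm: sum of l2 norms of the per-feature weight rows.

    Driving whole rows to zero deselects the corresponding features."""
    return np.sum(np.sqrt(np.sum(W ** 2, axis=1)))

# Toy example: 5 samples, 4 features, 3 labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))
W = rng.normal(size=(4, 3))
D = rng.dirichlet(np.ones(3), size=5)   # ground-truth label distributions

P = predict_distribution(X, W)
objective = kl_loss(D, P) + 0.1 * l21_norm(W)   # 0.1 is an assumed trade-off weight
```

In the paper's full objective this base term is further augmented with label-correlation and mutual-information terms, which are omitted here.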
Pages: 1827 - 1840 (13 pages)
Related Papers (50 in total)
  • [21] Label distribution feature selection based on label-specific features
    Shu, Wenhao
    Xia, Qiang
    Qian, Wenbin
    APPLIED INTELLIGENCE, 2024, 54 (19) : 9195 - 9212
  • [22] Random forest feature selection for partial label learning
    Sun, Xianran
    Chai, Jing
    NEUROCOMPUTING, 2023, 561
  • [23] Multi-label Learning with Label-Specific Feature Selection
    Yan, Yan
    Li, Shining
    Yang, Zhe
    Zhang, Xiao
    Li, Jing
    Wang, Anyi
    Zhang, Jingyu
    NEURAL INFORMATION PROCESSING, ICONIP 2017, PT I, 2017, 10634 : 305 - 315
  • [24] Multi-View Multi-Label Learning With Sparse Feature Selection for Image Annotation
    Zhang, Yongshan
    Wu, Jia
    Cai, Zhihua
    Yu, Philip S.
    IEEE TRANSACTIONS ON MULTIMEDIA, 2020, 22 (11) : 2844 - 2857
  • [25] Feature selection for label distribution learning using Dempster-Shafer evidence theory
    Zhao, Zhengwei
    Wang, Rongrong
    Pang, Wei
    Li, Zhaowen
    APPLIED INTELLIGENCE, 2025, 55 (04)
  • [26] Multi-label feature selection via feature manifold learning and sparsity regularization
    Cai, Zhiling
    Zhu, William
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2018, 9 (08) : 1321 - 1334
  • [28] Unsupervised feature selection via multiple graph fusion and feature weight learning
    Tang, Chang
    Zheng, Xiao
    Zhang, Wei
    Liu, Xinwang
    Zhu, Xinzhong
    Zhu, En
    SCIENCE CHINA INFORMATION SCIENCES, 2023, 66 (05) : 56 - 72
  • [29] Robust Feature Selection with Feature Correlation via Sparse Multi-Label Learning
    Cheng, Jiangjiang
    Mei, Junmei
    Zhong, Jing
    Men, Min
    Zhong, Ping
    PATTERN RECOGNITION AND IMAGE ANALYSIS, 2020, 30 (01) : 52 - 62