Feature selection for label distribution learning under feature weight view

Cited by: 0
Authors
Shidong Lin
Chenxi Wang
Yu Mao
Yaojin Lin
Affiliations
[1] Minnan Normal University, School of Computer Science
[2] Minnan Normal University, Key Laboratory of Data Science and Intelligence Application
Keywords
Feature selection; Label distribution learning; Feature weight; Mutual information; Label correlation
DOI: not available
Abstract
Label Distribution Learning (LDL) is a fine-grained learning paradigm that addresses label ambiguity, yet it suffers from the curse of dimensionality. Feature selection is an effective approach to dimensionality reduction, and several feature selection algorithms have been proposed for LDL that tackle the problem from different views. In this paper, we propose a novel feature selection method for LDL. First, an effective LDL model is trained with a classical LDL loss function composed of the maximum entropy model and the KL divergence. Then, to select common and label-specific features, their weights are enhanced by the $l_{21}$-norm and label correlation, respectively. Since constraining the parameters directly with label correlation would drive the label-specific features of related labels to become nearly identical, we instead constrain the output of the maximum entropy model. Finally, the method introduces Mutual Information (MI) into the optimization model for LDL feature selection for the first time, which distinguishes similar features and thus reduces the influence of redundant features. Experimental results on twelve datasets validate the effectiveness of the proposed method.
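The abstract names three technical ingredients: a maximum entropy output model trained with a KL-divergence loss, an $l_{21}$-norm penalty whose row norms serve as feature weights, and a mutual information term that discounts redundant features. The snippet below is a minimal NumPy sketch of these pieces under simplifying assumptions of our own (a single linear parameter matrix W, plain gradient descent, no label-correlation constraint, and a crude histogram MI estimate); it illustrates the ideas rather than reproducing the authors' actual optimization model.

```python
# Sketch only: simplified stand-ins for the components named in the abstract,
# not the paper's objective or algorithm.
import numpy as np

def max_entropy_output(X, W):
    """Maximum entropy (softmax) output model: one label distribution per row of X."""
    scores = X @ W                                   # (n_samples, n_labels)
    scores -= scores.max(axis=1, keepdims=True)      # numerical stability
    exps = np.exp(scores)
    return exps / exps.sum(axis=1, keepdims=True)

def kl_loss(D, P, eps=1e-12):
    """Mean KL divergence between true label distributions D and predictions P."""
    return float(np.sum(D * np.log((D + eps) / (P + eps))) / D.shape[0])

def l21_norm(W):
    """l_{2,1} norm of W: sum of row-wise l2 norms, which encourages row sparsity."""
    return float(np.sum(np.sqrt(np.sum(W ** 2, axis=1) + 1e-12)))

def fit(X, D, lam=0.1, lr=0.05, epochs=500):
    """Gradient descent on KL(D || softmax(XW)) + lam * l21(W) (simplified objective)."""
    n, d = X.shape
    W = np.zeros((d, D.shape[1]))
    for _ in range(epochs):
        P = max_entropy_output(X, W)
        grad_kl = X.T @ (P - D) / n                           # gradient of the KL term
        row_norms = np.sqrt(np.sum(W ** 2, axis=1) + 1e-12)
        grad_l21 = W / row_norms[:, None]                     # (sub)gradient of the l21 term
        W -= lr * (grad_kl + lam * grad_l21)
    return W

def feature_weights(W):
    """Row-wise l2 norms of W act as per-feature weights for selection."""
    return np.sqrt(np.sum(W ** 2, axis=1))

def mutual_information(a, b, bins=10):
    """Histogram estimate of MI between two feature columns (a redundancy proxy)."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    joint /= joint.sum()
    pa, pb = joint.sum(axis=1), joint.sum(axis=0)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / np.outer(pa, pb)[nz])))

# Toy usage: rank features by learned weight on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
D = rng.dirichlet(np.ones(4), size=100)       # each row is a label distribution
W = fit(X, D)
ranking = np.argsort(-feature_weights(W))     # candidate features, most important first
print(ranking[:5], mutual_information(X[:, ranking[0]], X[:, ranking[1]]))
```

In the method described above, the feature weights, the label-correlation constraint on the model output, and the MI term are coupled inside a single optimization model; this sketch only shows each piece in isolation.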
Pages: 1827-1840 (13 pages)
Source: International Journal of Machine Learning and Cybernetics, 2024, 15(05)