Feature selection for label distribution learning under feature weight view

Cited: 0
Authors
Shidong Lin
Chenxi Wang
Yu Mao
Yaojin Lin
Institutions
[1] Minnan Normal University,School of Computer Science
[2] Minnan Normal University,Key Laboratory of Data Science and Intelligence Application
Keywords
Feature selection; Label distribution learning; Feature weight; Mutual information; Label correlation;
DOI: not available
Abstract
Label Distribution Learning (LDL) is a fine-grained learning paradigm that addresses label ambiguity, yet it suffers from the curse of dimensionality. Feature selection is an effective method for dimensionality reduction, and several algorithms have been proposed for LDL that tackle the problem from different views. In this paper, we propose a novel feature selection method for LDL. First, an effective LDL model is trained with a classical LDL loss function composed of the maximum entropy model and KL divergence. Then, to select common and label-specific features, their weights are enhanced by the $l_{21}$-norm and label correlation, respectively. Since directly constraining the parameters by label correlation would drive the label-specific features of relevant labels to become identical, we instead constrain the output of the maximum entropy model. Finally, the proposed method introduces Mutual Information (MI) into the optimization model for LDL feature selection for the first time, distinguishing similar features and thereby reducing the influence of redundant features. Experimental results on twelve datasets validate the effectiveness of the proposed method.
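The abstract's core building blocks (a maximum entropy output model, a KL divergence fit term, and an $l_{21}$-norm row-sparsity regularizer on the feature weights) can be sketched as follows. This is a minimal NumPy illustration of those components only, not the authors' implementation; the function names, toy data, and the 0.1 regularization weight are all assumptions for the example.

```python
import numpy as np

def predict_distribution(X, W):
    """Maximum entropy (softmax) model: each row is a predicted label distribution."""
    scores = X @ W
    scores -= scores.max(axis=1, keepdims=True)  # shift for numerical stability
    expos = np.exp(scores)
    return expos / expos.sum(axis=1, keepdims=True)

def kl_loss(D, P, eps=1e-12):
    """Mean KL divergence between true (D) and predicted (P) label distributions."""
    return np.mean(np.sum(D * np.log((D + eps) / (P + eps)), axis=1))

def l21_norm(W):
    """Row-sparse regularizer: sum of the l2 norms of the feature-weight rows,
    which pushes whole feature rows toward zero (feature selection)."""
    return np.sum(np.sqrt(np.sum(W ** 2, axis=1)))

# toy example with hypothetical data: 8 samples, 5 features, 3 labels
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 5))
D = rng.dirichlet(np.ones(3), size=8)   # ground-truth label distributions
W = np.zeros((5, 3))                    # feature-weight matrix to be learned

P = predict_distribution(X, W)          # uniform distributions at W = 0
loss = kl_loss(D, P) + 0.1 * l21_norm(W)
```

In the paper's full objective this loss would additionally carry the label-correlation constraint on the model output and the MI-based redundancy term; the sketch above covers only the two terms the abstract attributes to the classical LDL loss plus the sparsity regularizer.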
Pages: 1827 - 1840 (13 pages)
Related Papers (50 items)
  • [31] Unsupervised feature selection via multiple graph fusion and feature weight learning
    Tang, Chang
    Zheng, Xiao
    Zhang, Wei
    Liu, Xinwang
    Zhu, Xinzhong
    Zhu, En
    SCIENCE CHINA-INFORMATION SCIENCES, 2023, 66 (05)
  • [32] Distribution preserving learning for unsupervised feature selection
    Xie, Ting
    Ren, Pengfei
    Zhang, Taiping
    Tang, Yuan Yan
    NEUROCOMPUTING, 2018, 289 : 231 - 240
  • [33] Label disambiguation-based feature selection for partial label learning via fuzzy dependency and feature discernibility
    Qian, Wenbin
    Ding, Jinfei
    Li, Yihui
    Huang, Jintao
    APPLIED SOFT COMPUTING, 2024, 161
  • [34] Feature selection based on label distribution and fuzzy mutual information
    Xiong, Chuanzhen
    Qian, Wenbin
    Wang, Yinglong
    Huang, Jintao
    INFORMATION SCIENCES, 2021, 574 : 297 - 319
  • [35] Label distribution feature selection based on neighborhood rough set
    Wu, Yilin
    Guo, Wenzhong
    Lin, Yaojin
    CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2024, 36 (23)
  • [36] Multi-View Unsupervised Feature Selection with Adaptive Similarity and View Weight
    Hou, Chenping
    Nie, Feiping
    Tao, Hong
    Yi, Dongyun
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2017, 29 (09) : 1998 - 2011
  • [37] ProLSFEO-LDL: Prototype Selection and Label-Specific Feature Evolutionary Optimization for Label Distribution Learning
    Gonzalez, Manuel
    Cano, Jose-Ramon
    Garcia, Salvador
    APPLIED SCIENCES-BASEL, 2020, 10 (09)
  • [38] Label distribution feature selection for multi-label classification with rough set
    Qian, Wenbin
    Huang, Jintao
    Wang, Yinglong
    Xie, Yonghong
    INTERNATIONAL JOURNAL OF APPROXIMATE REASONING, 2021, 128 : 32 - 55
  • [39] Sparse Matrix Feature Selection in Multi-label Learning
    Yang, Wenyuan
    Zhou, Bufang
    Zhu, William
    ROUGH SETS, FUZZY SETS, DATA MINING, AND GRANULAR COMPUTING, RSFDGRC 2015, 2015, 9437 : 332 - 339
  • [40] Multi-label Feature Selection with Adaptive Subspace Learning
    Yuan, Dongjie
    Yuan, Bin
    Zhong, Yan
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT I, KSEM 2024, 2024, 14884 : 148 - 160