Feature selection for label distribution learning using dual-similarity based neighborhood fuzzy entropy

Cited by: 16
Authors:
Deng, Zhixuan [1 ,2 ]
Li, Tianrui [1 ,2 ]
Deng, Dayong [3 ]
Liu, Keyu [1 ,2 ]
Zhang, Pengfei [1 ,2 ]
Zhang, Shiming [1 ,2 ]
Luo, Zhipeng [1 ,2 ]
Affiliations:
[1] Southwest Jiaotong Univ, Sch Comp & Artificial Intelligence, Chengdu 611756, Peoples R China
[2] Mfg Ind Chains Collaborat & Informat Support Techn, Key Lab Sichuan Prov, Chengdu 611756, Peoples R China
[3] Zhejiang Normal Univ, Xingzhi Coll, Lanxi 321100, Peoples R China
Funding:
U.S. National Science Foundation; National Key Research and Development Program of China;
Keywords:
Neighborhood rough sets; Fuzzy entropy; Feature selection; Label distribution learning; ATTRIBUTE REDUCTION; MUTUAL INFORMATION;
DOI
10.1016/j.ins.2022.10.054
Chinese Library Classification (CLC):
TP [Automation Technology, Computer Technology];
Discipline Classification Code:
0812;
Abstract:
Label distribution learning (LDL) is a novel framework for handling label ambiguity problems and has been widely used in practice. However, dealing with high-dimensional data or data with redundant features in the LDL context remains an open problem. Existing feature selection algorithms cannot be directly applied to LDL owing to the unique challenges posed by the inherent uncertainty of label distributions. In this paper, we propose a novel LDL feature selection algorithm based on neighborhood rough sets. Specifically, we first introduce dual-similarity, which measures sample similarity in both the feature space and the label space. Second, we devise a novel neighborhood fuzzy entropy as a feature evaluation metric, with which neighborhood rough sets can be applied to LDL problems. Lastly, we build a feature selection model that inherits the spirit of neighborhood rough sets and neighborhood fuzzy entropy. Extensive experiments on twelve real-world LDL datasets demonstrate the superiority of the proposed model over six other state-of-the-art algorithms. (c) 2022 Elsevier Inc. All rights reserved.
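The abstract describes the approach only at a high level, and the paper's exact definitions of dual-similarity and neighborhood fuzzy entropy are not reproduced in this record. The Python sketch below is a rough illustration of one plausible shape of such a pipeline, not the authors' method: Gaussian similarity kernels for the feature and label-distribution spaces, a generic fuzzy-entropy-style score that rewards agreement between the two similarity matrices, and greedy forward selection. All function names, the kernel choice, the entropy form, the parameter sigma, and the stopping rule are assumptions made for illustration.

```python
# Hypothetical sketch of dual-similarity based feature selection for LDL.
# The kernel, entropy form, and greedy search are assumptions, not the paper's definitions.
import numpy as np

def similarity(M, sigma=0.5):
    # Gaussian similarity matrix from pairwise Euclidean distances (assumed kernel).
    d = np.linalg.norm(M[:, None, :] - M[None, :, :], axis=-1)
    return np.exp(-(d ** 2) / (2 * sigma ** 2))

def fuzzy_entropy(S_feat, S_lab):
    # Generic fuzzy-entropy-style score: small when samples that are close in the
    # feature space are also close in the label-distribution space (dual similarity).
    agree = np.clip(S_feat * S_lab + (1 - S_feat) * (1 - S_lab), 1e-12, 1 - 1e-12)
    return -np.mean(agree * np.log(agree) + (1 - agree) * np.log(1 - agree))

def select_features(X, D, k, sigma=0.5):
    # Greedy forward selection of k features that minimize the entropy score.
    S_lab = similarity(D, sigma)              # similarity in the label-distribution space
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < k and remaining:
        scores = {f: fuzzy_entropy(similarity(X[:, selected + [f]], sigma), S_lab)
                  for f in remaining}
        best = min(scores, key=scores.get)
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy usage: 50 samples, 8 features, label distributions over 3 labels (rows sum to 1).
rng = np.random.default_rng(0)
X = rng.random((50, 8))
D = rng.dirichlet(np.ones(3), size=50)
print(select_features(X, D, k=3))
```

In this sketch the label-space similarity plays the role of the decision information in a neighborhood rough set model: a candidate feature is preferred when the neighborhood structure it induces is consistent with the neighborhood structure of the label distributions.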
Pages: 385-404 (20 pages)