Double-dictionary learning unsupervised feature selection cooperating with low-rank and sparsity

Cited by: 0
Authors
Shang, Ronghua [1]
Song, Jiuzheng [1]
Gao, Lizhuo [1]
Lu, Mengyao [1]
Jiao, Licheng [1]
Xu, Songhua [2]
Li, Yangyang [1]
Affiliations
[1] Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Shaanxi Province, Xi'an 710071, China
[2] Department of Health Management & Institute of Medical Artificial Intelligence, The Second Affiliated Hospital of Xi'an Jiaotong University, Xi'an, China
Funding
National Natural Science Foundation of China;
Keywords
Consensus algorithm; Unsupervised learning;
DOI
10.1016/j.knosys.2024.112566
Abstract
Feature selection algorithms based on dictionary learning have been widely studied for their strong interpretability. During feature selection, many algorithms consider only the global or only the local geometric structure of the original data. The few algorithms that use both kinds of information do not actually exploit them synchronously, so neither is fully and reasonably utilized. For this reason, a novel feature selection algorithm, double-dictionary learning unsupervised feature selection cooperating with low-rank and sparsity (LRSDFS), is proposed in this paper. First, LRSDFS improves on traditional dictionary learning by reconstructing the original dataset with two dictionaries simultaneously. Second, low-rank and sparsity constraints are applied to the two dictionaries, so that the reconstruction retains both the global and the local information of the original data. Finally, the global and local information is weighted to perform feature selection, making the selected features more reasonable and interpretable. LRSDFS is compared with seven state-of-the-art algorithms, including a baseline, on nine publicly available benchmark datasets. The results show that LRSDFS is more efficient than the other unsupervised feature selection algorithms.
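The record does not reproduce the paper's formulation, so the following is only a rough sketch of the idea outlined in the abstract, written in plain NumPy under stated assumptions: the data X is an n-by-d matrix, a hard low-rank projection stands in for the low-rank constraint on one dictionary, element-wise soft-thresholding stands in for the sparsity constraint on the other, and features are ranked by a weighted combination of the column norms of the two coefficient matrices. The function name lrsdfs_sketch, the alternating least-squares updates, and all parameter values are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def lrsdfs_sketch(X, n_select, n_atoms=20, alpha=0.5, rank=5,
                  sparse_thresh=0.1, n_iter=30, seed=0):
    """Illustrative sketch only (hypothetical, not the published LRSDFS).

    Two reconstructions of X (n_samples x n_features) are learned in the
    same loop: a "global" branch whose dictionary is projected onto a
    low-rank set, and a "local" branch whose dictionary is soft-thresholded
    toward sparsity. Features are ranked by a weighted combination of the
    column norms of the two coefficient matrices.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Dg = rng.standard_normal((n, n_atoms))   # dictionary for the global branch
    Ds = rng.standard_normal((n, n_atoms))   # dictionary for the local branch

    for _ in range(n_iter):
        # Global branch: least-squares coefficient update, then a hard
        # low-rank projection of the dictionary (stand-in for a low-rank
        # constraint that preserves global structure).
        Cg = np.linalg.lstsq(Dg, X, rcond=None)[0]
        Dg = X @ np.linalg.pinv(Cg)
        U, s, Vt = np.linalg.svd(Dg, full_matrices=False)
        s[rank:] = 0.0
        Dg = (U * s) @ Vt

        # Local branch: least-squares coefficient update, then element-wise
        # soft-thresholding of the dictionary (stand-in for a sparsity
        # constraint that preserves local structure).
        Cs = np.linalg.lstsq(Ds, X, rcond=None)[0]
        Ds = X @ np.linalg.pinv(Cs)
        Ds = np.sign(Ds) * np.maximum(np.abs(Ds) - sparse_thresh, 0.0)

    # Weight the global and local coefficient norms and keep the top features.
    score = alpha * np.linalg.norm(Cg, axis=0) + (1 - alpha) * np.linalg.norm(Cs, axis=0)
    return np.argsort(score)[::-1][:n_select]

# Toy usage on random data (100 samples, 50 features, keep 10).
X = np.random.default_rng(1).standard_normal((100, 50))
print(lrsdfs_sketch(X, n_select=10))
```

In this sketch, alpha plays the role of the global/local weighting described in the abstract, and the returned indices are the selected features.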
Related Papers
50 records in total
  • [1] Low-rank dictionary learning for unsupervised feature selection
    Parsa, Mohsen Ghassemi
    Zare, Hadi
    Ghatee, Mehdi
    EXPERT SYSTEMS WITH APPLICATIONS, 2022, 202
  • [2] Unsupervised feature extraction by low-rank and sparsity preserving embedding
    Zhan, Shanhua
    Wu, Jigang
    Han, Na
    Wen, Jie
    Fang, Xiaozhao
    NEURAL NETWORKS, 2019, 109 : 56 - 66
  • [3] Unsupervised feature selection via low-rank approximation and structure learning
    Wang, Shiping
    Wang, Han
    KNOWLEDGE-BASED SYSTEMS, 2017, 124 : 70 - 79
  • [4] Unsupervised feature selection with graph learning via low-rank constraint
    Lu, Guangquan
    Li, Bo
    Yang, Weiwei
    Yin, Jian
    MULTIMEDIA TOOLS AND APPLICATIONS, 2018, 77 (22) : 29531 - 29549
  • [5] Low-rank structure preserving for unsupervised feature selection
    Zheng, Wei
    Xu, Chunyan
    Yang, Jian
    Gao, Junbin
    Zhu, Fa
    NEUROCOMPUTING, 2018, 314 : 360 - 370
  • [6] Sparsity and Low-Rank Dictionary Learning for Sparse Representation of Monogenic Signal
    Dong, Ganggang
    Wang, Na
    Kuang, Gangyao
    Qiu, Hongbing
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2018, 11 (01) : 141 - 153
  • [7] Low-rank unsupervised graph feature selection via feature self-representation
    He, Wei
    Zhu, Xiaofeng
    Cheng, Debo
    Hu, Rongyao
    Zhang, Shichao
    MULTIMEDIA TOOLS AND APPLICATIONS, 2017, 76 (09) : 12149 - 12164
  • [8] Multi-view unsupervised feature selection with tensor low-rank minimization
    Yuan, Haoliang
    Li, Junyu
    Liang, Yong
    Tang, Yuan Yan
    NEUROCOMPUTING, 2022, 487 : 75 - 85