Double-dictionary learning unsupervised feature selection cooperating with low-rank and sparsity

Cited by: 0
Authors
Shang, Ronghua [1 ]
Song, Jiuzheng [1 ]
Gao, Lizhuo [1 ]
Lu, Mengyao [1 ]
Jiao, Licheng [1 ]
Xu, Songhua [2 ]
Li, Yangyang [1 ]
Affiliations
[1] Key Laboratory of Intelligent Perception and Image Understanding of Ministry of Education, School of Artificial Intelligence, Xidian University, Xi'an, 710071, Shaanxi Province, China
[2] Department of Health Management & Institute of Medical Artificial Intelligence, The Second Affiliated Hospital of Xi'an Jiaotong University, Xi'an, China
Funding
National Natural Science Foundation of China
Keywords
Consensus algorithm; Unsupervised learning
DOI
10.1016/j.knosys.2024.112566
Abstract
Feature selection algorithms based on dictionary learning have been widely studied for their excellent interpretability. During feature selection, many algorithms consider only the global or the local geometric structure information of the original data. The few algorithms that use global and local information together do not actually exploit the two parts synchronously, so neither part is fully and reasonably utilized. For this reason, a novel feature selection algorithm, double-dictionary learning unsupervised feature selection cooperating with low-rank and sparsity (LRSDFS), is proposed in this paper. First, LRSDFS extends traditional dictionary learning by reconstructing the original dataset with two dictionaries simultaneously. Second, low-rank and sparsity constraints are applied to the two dictionaries, so that the reconstructions retain the global and local information of the original data at the same time. Finally, the global and local information is weighted to perform feature selection, making the selected features more reasonable and interpretable. LRSDFS is compared with seven state-of-the-art algorithms, including a baseline, on nine publicly available benchmark datasets. The results show that LRSDFS is more efficient than the other unsupervised feature selection algorithms. © 2024
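The record does not reproduce the paper's objective function, but the steps described above can be illustrated with a minimal sketch. Assuming a data matrix X ∈ R^{n×d}, two dictionaries D_1 and D_2 with coefficient matrices A_1 and A_2, and trade-off weights α and β (all notation here is an assumption for illustration, not the authors' exact formulation), a double-reconstruction objective of the kind described would take a form such as

\min_{D_1, D_2, A_1, A_2} \; \|X - D_1 A_1\|_F^2 + \|X - D_2 A_2\|_F^2 + \alpha \|D_1\|_* + \beta \|D_2\|_{2,1}

where the nuclear norm keeps D_1 low-rank (capturing global structure), the \ell_{2,1} norm keeps D_2 row-sparse (capturing local structure), and the final feature ranking weights each feature's contribution to the two reconstructions before the top-ranked features are selected.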
Related Papers (50 records in total)
  • [31] Adaptive dictionary and structure learning for unsupervised feature selection
    Guo, Yanrong
    Sun, Huihui
    Hao, Shijie
    INFORMATION PROCESSING & MANAGEMENT, 2022, 59 (03)
  • [32] Joint dictionary and graph learning for unsupervised feature selection
    Deqiong Ding
    Fei Xia
    Xiaogao Yang
    Chang Tang
    Applied Intelligence, 2020, 50 : 1379 - 1397
  • [33] UNSUPERVISED FEATURE SELECTION BY NONNEGATIVE SPARSITY ADAPTIVE SUBSPACE LEARNING
    Zhou, Nan
    Cheng, Hong
    Zheng, Ya-Li
    He, Liang-Tian
    Pedrycz, Witold
    PROCEEDINGS OF 2016 INTERNATIONAL CONFERENCE ON WAVELET ANALYSIS AND PATTERN RECOGNITION (ICWAPR), 2016, : 18 - 24
  • [34] Adaptive graph learning and low-rank constraint for supervised spectral feature selection
    Zhi Zhong
    Neural Computing and Applications, 2020, 32 : 6503 - 6512
  • [35] Low-Rank Correlation Learning for Unsupervised Domain Adaptation
    Lu, Yuwu
    Wong, Wai Keung
    Yuan, Chun
    Lai, Zhihui
    Li, Xuelong
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 4153 - 4167
  • [36] Adaptive graph learning and low-rank constraint for supervised spectral feature selection
    Zhong, Zhi
    NEURAL COMPUTING & APPLICATIONS, 2020, 32 (11): : 6503 - 6512
  • [37] Dual-dual subspace learning with low-rank consideration for feature selection
    Moslemi, Amir
    Bidar, Mahdi
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2024, 651
  • [38] LEARNING A LOW-RANK SHARED DICTIONARY FOR OBJECT CLASSIFICATION
    Vu, Tiep H.
    Monga, Vishal
    2016 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2016, : 4428 - 4432
  • [39] Discriminative low-rank dictionary learning for face recognition
    Hoangvu Nguyen
    Yang, Wankou
    Sheng, Biyun
    Sun, Changyin
    NEUROCOMPUTING, 2016, 173 : 541 - 551
  • [40] Learning low-rank and discriminative dictionary for image classification
    Li, Liangyue
    Li, Sheng
    Fu, Yun
    IMAGE AND VISION COMPUTING, 2014, 32 (10) : 814 - 823