Low-rank dictionary learning for unsupervised feature selection

Cited by: 16
Authors
Parsa, Mohsen Ghassemi [1 ]
Zare, Hadi [1 ]
Ghatee, Mehdi [2 ]
Affiliations
[1] Univ Tehran, Fac New Sci & Technol, Tehran, Iran
[2] Amirkabir Univ Technol, Dept Math & Comp Sci, Tehran, Iran
Keywords
Unsupervised feature selection; Dictionary learning; Sparse learning; Spectral analysis; Low-rank representation; ALGORITHM; REGRESSION;
DOI
10.1016/j.eswa.2022.117149
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification
081104; 0812; 0835; 1405
Abstract
High-dimensional data arise in many real-world applications, such as biology, computer vision, and social networks. Feature selection methods address the challenges of high dimensionality by enabling efficient learning and reducing model complexity. Because labeling such datasets is costly, a variety of unsupervised feature selection approaches have been developed that exploit important characteristics of the data. In this paper, we introduce a novel unsupervised feature selection approach that applies the dictionary learning idea within a low-rank representation. Low-rank dictionary learning not only provides a new data representation but also preserves feature correlations. Spectral analysis is then employed to preserve sample similarities. Finally, a unified objective function for unsupervised feature selection is proposed, with an l2,1-norm regularization inducing sparsity. Furthermore, an efficient numerical algorithm is designed to solve the corresponding optimization problem. We evaluate the proposed method on a variety of standard datasets from different applied domains; the experimental results show that it outperforms state-of-the-art algorithms.
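To make the abstract's core mechanism concrete: an l2,1-norm penalty on a feature-weight matrix W sums the Euclidean norms of W's rows, which drives entire rows toward zero so that the surviving rows identify the selected features. The sketch below is a minimal toy illustration of that idea (a reconstruction loss plus an l2,1 proximal step, followed by ranking features by row norms); it is not the paper's algorithm, and the function names, step size, and initialization are invented for illustration only.

```python
import numpy as np

def l21_norm(W):
    """l2,1 norm: sum over rows i of ||W[i, :]||_2."""
    return np.sum(np.linalg.norm(W, axis=1))

def select_features_l21(X, k=2, n_select=3, lam=0.1, lr=1e-3, n_iter=100, seed=0):
    """Toy sketch (not the paper's method): fit X ~ X @ W @ H with an
    l2,1 penalty on W via proximal gradient steps, then rank the d
    features by the row norms of W (larger norm = more important)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = 0.01 * rng.standard_normal((d, k))   # feature-selection matrix
    H = 0.01 * rng.standard_normal((k, d))   # low-rank "dictionary" factor
    for _ in range(n_iter):
        R = X @ W @ H - X                    # reconstruction residual
        # gradient steps on the smooth reconstruction term
        W -= lr * (X.T @ R @ H.T)
        H -= lr * (W.T @ X.T @ R)
        # proximal step for the l2,1 penalty: shrink whole rows of W
        norms = np.linalg.norm(W, axis=1, keepdims=True)
        W *= np.maximum(1.0 - lr * lam / np.maximum(norms, 1e-12), 0.0)
    scores = np.linalg.norm(W, axis=1)       # row norms = feature scores
    return np.argsort(scores)[::-1][:n_select]

# usage: score 6 features of a random 50-sample matrix
X = np.random.default_rng(1).standard_normal((50, 6))
selected = select_features_l21(X, k=2, n_select=3)
```

The row-wise shrinkage step is what makes the l2,1 penalty a feature selector rather than a generic sparsifier: it zeroes coordinated groups (rows) instead of individual entries.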
Pages: 13