Dual-dual subspace learning with low-rank consideration for feature selection

Times Cited: 0
Authors
Moslemi, Amir [1 ]
Bidar, Mahdi [2 ]
Affiliations
[1] Seneca Polytech, Sch Software Design & Data Sci, Toronto, ON M2J 2X5, Canada
[2] Univ Regina, Dept Comp Sci, Regina, SK, Canada
Keywords
Nonnegative matrix factorization; Unsupervised feature selection; Regularization; Low-rank; UNSUPERVISED FEATURE-SELECTION; SUPERVISED FEATURE-SELECTION; MATRIX FACTORIZATION; APPROXIMATION;
DOI
10.1016/j.physa.2024.129997
CLC Number
O4 [Physics]
Discipline Code
0702
Abstract
The performance of machine learning algorithms can be degraded by the redundant features of high-dimensional data, and these irrelevant features also increase the computation time of the learning model. These problems can be addressed with techniques such as feature selection and dimensionality reduction. Unsupervised feature selection has drawn increasing attention because collecting labels for supervised feature selection is difficult. To this end, we developed an innovative approach based on nonnegative matrix factorization (NMF) to remove redundant information. In this technique, for the first time, local-information-preserving and global-information-preserving regularization are applied to both the feature weight matrix and the representation matrix, which is why we call the method Dual-Dual regularized feature selection. Furthermore, the Schatten p-norm is utilized to exploit the inherent low-rank structure of the data. To demonstrate the effectiveness of the proposed method, experimental studies are conducted on six benchmark datasets. The computational results show that the proposed method is more efficient for feature selection than state-of-the-art unsupervised feature selection techniques.
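To make the NMF-based selection idea concrete, the following is a minimal sketch of the plain NMF backbone only: it omits the paper's dual local/global regularization terms and the Schatten p-norm, and it assumes (as is common in NMF-based methods) that features are scored by the norms of their columns in the learned feature weight matrix. All function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def nmf_feature_scores(X, k=5, n_iter=200, eps=1e-10, seed=0):
    """Factorize nonnegative X (n_samples x n_features) as X ~= U @ V
    using Lee-Seung multiplicative updates for the Frobenius loss,
    then score each feature by the l2 norm of its column in V."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    U = rng.random((n, k))   # representation matrix
    V = rng.random((k, d))   # feature weight matrix
    for _ in range(n_iter):
        # multiplicative updates keep U and V nonnegative
        U *= (X @ V.T) / (U @ V @ V.T + eps)
        V *= (U.T @ X) / (U.T @ U @ V + eps)
    return np.linalg.norm(V, axis=0)  # one score per feature

# usage: rank features and keep the top 10 (synthetic nonnegative data)
X = np.abs(np.random.default_rng(1).normal(size=(100, 20)))
scores = nmf_feature_scores(X)
top = np.argsort(scores)[::-1][:10]
```

In the full method, graph-Laplacian (local) and global regularizers on both U and V, plus the Schatten p-norm low-rank penalty, would modify these update rules; this sketch shows only the shared factorize-then-rank skeleton.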
Pages: 17
Related Papers
50 records total
  • [1] Transfer subspace learning joint low-rank representation and feature selection
    Yang, Liran
    Zhou, Qinghua
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2022, 81 (27) : 38353 - 38373
  • [3] Robust Multi-View Subspace Learning through Dual Low-Rank Decompositions
    Ding, Zhengming
    Fu, Yun
    [J]. THIRTIETH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2016, : 1181 - 1187
  • [4] Feature selection embedded subspace clustering with low-rank and locality constraints
    You, Cong-Zhe
    Wu, Xiao-Jun
    [J]. 2018 IEEE INTERNATIONAL SMART CITIES CONFERENCE (ISC2), 2018,
  • [5] Low-rank dictionary learning for unsupervised feature selection
    Parsa, Mohsen Ghassemi
    Zare, Hadi
    Ghatee, Mehdi
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2022, 202
  • [6] Dual Graph Regularized Latent Low-Rank Representation for Subspace Clustering
    Yin, Ming
    Gao, Junbin
    Lin, Zhouchen
    Shi, Qinfeng
    Guo, Yi
    [J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2015, 24 (12) : 4918 - 4933
  • [7] Low-Rank Transfer Subspace Learning
    Shao, Ming
    Castillo, Carlos
    Gu, Zhenghong
    Fu, Yun
    [J]. 12TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2012), 2012, : 1104 - 1109
  • [8] Consistent affinity representation learning with dual low-rank constraints for multi-view subspace clustering
    Fu, Lele
    Li, Jieling
    Chen, Chuan
    [J]. NEUROCOMPUTING, 2022, 514 : 113 - 126
  • [9] Adaptive structure learning for low-rank supervised feature selection
    Zhu, Yonghua
    Zhang, Xuejun
    Hu, Rongyao
    Wen, Guoqiu
    [J]. PATTERN RECOGNITION LETTERS, 2018, 109 : 89 - 96
  • [10] Low-Rank Sparse Feature Selection for Patient Similarity Learning
    Zhan, Mengting
    Cao, Shilei
    Qian, Buyue
    Chang, Shiyu
    Wei, Jishang
    [J]. 2016 IEEE 16TH INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2016, : 1335 - 1340