Subspace learning for unsupervised feature selection via matrix factorization

Cited by: 135
Authors
Wang, Shiping [1 ,2 ]
Pedrycz, Witold [2 ,3 ]
Zhu, Qingxin [1 ]
Zhu, William [4 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Chengdu 611731, Peoples R China
[2] Univ Alberta, Dept Elect & Comp Engn, Edmonton, AB T6G 2G7, Canada
[3] Polish Acad Sci, Syst Res Inst, PL-01447 Warsaw, Poland
[4] Minnan Normal Univ, Lab Granular Comp, Zhangzhou 363000, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Machine learning; Feature selection; Unsupervised learning; Matrix factorization; Subspace distance; Kernel method; SINGULAR-VALUE DECOMPOSITION; INPUT FEATURE-SELECTION; MUTUAL INFORMATION; EFFICIENT; CONSISTENCY; ALGORITHMS;
DOI
10.1016/j.patcog.2014.08.004
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Dimensionality reduction is an important and challenging task in machine learning and data mining. Feature selection and feature extraction are two commonly used techniques for reducing the dimensionality of data and increasing the efficiency of learning algorithms. In particular, feature selection performed in the absence of class labels, namely unsupervised feature selection, is both challenging and interesting. In this paper, we propose a new unsupervised feature selection criterion developed from the viewpoint of subspace learning, which is treated as a matrix factorization problem. The advantages of this work are four-fold. First, building on matrix factorization, a unified framework is established for feature selection, feature extraction and clustering. Second, an iterative update algorithm is provided via matrix factorization, which handles high-dimensional data efficiently. Third, an effective method for feature selection with numeric data is put forward, without resorting to a discretization process. Fourth, the new criterion provides a sound foundation for embedding kernel tricks into feature selection; to this end, an algorithm based on kernel methods is also proposed. The algorithms are compared with four state-of-the-art feature selection methods on six publicly available datasets. Experimental results demonstrate that, in terms of clustering quality, the two proposed algorithms outperform the others on almost all of the datasets considered. (C) 2014 Elsevier Ltd. All rights reserved.
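The core idea sketched in the abstract (select features by factorizing the data matrix so that the learned subspace is expressed through the original features, then rank features by their contribution) can be illustrated with a minimal NumPy sketch. This is not the paper's exact algorithm: the nonnegative multiplicative updates, the X ≈ (XW)H factorization form, and the row-norm scoring below are illustrative assumptions.

```python
import numpy as np

def mf_feature_scores(X, k=3, n_iter=200, eps=1e-10, seed=0):
    """Rank the d features of X (n samples x d features) by factorizing
    X ~ (X @ W) @ H with nonnegative W (d x k) and H (k x d).
    The learned k-dimensional subspace is spanned by combinations X @ W
    of the original features, so a large row norm in W marks a feature
    that contributes strongly to the subspace."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    Xp = X - X.min() + eps           # shift so multiplicative updates stay valid
    G = Xp.T @ Xp                    # d x d Gram matrix, entrywise nonnegative
    W = rng.random((d, k))
    H = rng.random((k, d))
    for _ in range(n_iter):
        # multiplicative updates for  min ||Xp - Xp W H||_F^2  s.t. W, H >= 0
        W *= (G @ H.T) / (G @ W @ (H @ H.T) + eps)
        H *= (W.T @ G) / (W.T @ G @ W @ H + eps)
    return np.linalg.norm(W, axis=1)  # one nonnegative score per feature

# usage: keep the m features with the largest scores
X = np.random.default_rng(1).random((60, 8))
scores = mf_feature_scores(X, k=3)
top3 = np.argsort(scores)[::-1][:3]
```

The multiplicative update rule is the standard NMF-style scheme applied to the reconstruction objective; replacing `Xp @ Xp.T`-type products with a kernel Gram matrix is what makes the kernelized variant mentioned in the abstract possible.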
Pages: 10-19 (10 pages)
Related Papers (50 total; first 10 shown)
  • [1] Du, Shiqiang; Ma, Yide; Li, Shouliang; Ma, Yurun. Robust unsupervised feature selection via matrix factorization. Neurocomputing, 2017, 241: 115-127.
  • [2] Qi, Miao; Wang, Ting; Liu, Fucong; Zhang, Baoxue; Wang, Jianzhong; Yi, Yugen. Unsupervised feature selection by regularized matrix factorization. Neurocomputing, 2018, 273: 593-610.
  • [3] Moslemi, Amir; Ahmadian, Arash. Subspace learning for feature selection via rank revealing QR factorization: Fast feature selection. Expert Systems with Applications, 2024, 256.
  • [4] Wang, Shiping; Chen, Jiawei; Guo, Wenzhong; Liu, Genggeng. Structured learning for unsupervised feature selection with high-order matrix factorization. Expert Systems with Applications, 2020, 140.
  • [5] Yi, Yugen; Zhou, Wei; Liu, Qinghua; Luo, Guoliang; Wang, Jianzhong; Fang, Yuming; Zheng, Caixia. Ordinal preserving matrix factorization for unsupervised feature selection. Signal Processing: Image Communication, 2018, 67: 118-131.
  • [6] Shang, Ronghua; Xu, Kaiming; Jiao, Licheng. Subspace learning for unsupervised feature selection via adaptive structure learning and rank approximation. Neurocomputing, 2020, 413: 72-84.
  • [7] Dong, Wenhua; Wu, Xiao-Jun; Li, Hui; Feng, Zhen-Hua; Kittler, Josef. Subspace clustering via joint unsupervised feature selection. In: 2020 25th International Conference on Pattern Recognition (ICPR), 2021: 3892-3898.
  • [8] Wang, Feng; Rao, Qi; Zhang, Yongquan; Chen, Xu. Robust sparse subspace learning for unsupervised feature selection. In: 2016 International Joint Conference on Neural Networks (IJCNN), 2016: 4205-4212.
  • [9] Balasubramaniam, Thirunavukarasu; Nayak, Richi; Yuen, Chau. Transfer learning via feature selection based nonnegative matrix factorization. In: Web Information Systems Engineering - WISE 2019, 2019, 11881: 82-97.
  • [10] Cao, Langcai; Lin, Xiaochang; Su, Sixing. Unsupervised feature selection based on matrix factorization and adaptive graph. Systems Engineering and Electronics, 2021, 43(8): 2197-2208.