Unsupervised feature selection based on matrix factorization and adaptive graph

Cited by: 0
Authors
Cao L. [1 ,2 ]
Lin X. [1 ,2 ]
Su S. [1 ,2 ]
Affiliations
[1] School of Aerospace Engineering, Xiamen University, Xiamen
[2] Xiamen Key Laboratory of Big Data Intelligent Analysis and Decision, Xiamen
Keywords
Adaptation; Feature selection; Graph embedding; Matrix factorization
DOI
10.12305/j.issn.1001-506X.2021.08.22
Abstract
Due to the curse of dimensionality, which is inevitable and troublesome in high-dimensional data analytics, it is of great importance to perform dimensionality reduction via feature selection. Therefore, an unsupervised feature selection model based on robust matrix factorization and adaptive graph learning (MFAGFS) is proposed, which performs robust matrix factorization, feature selection, and local structure learning within a unified learning framework. The model first obtains cluster labels by robust matrix factorization; the cluster labels and local structure information are then used to guide the feature selection process, and the local structure of the data is in turn learned adaptively from the result of feature selection. Through the interaction between these two basic tasks, local structure learning and feature selection, MFAGFS can accurately capture the structural information of the data and select discriminative features. The optimization method for the model is then described in detail, and the convergence of the algorithm is proved. Finally, comparative experiments and parameter sensitivity analyses are carried out on six public data sets to verify the effectiveness of the proposed model. The experimental results show that the proposed method improves performance to varying degrees compared with other methods. © 2021, Editorial Office of Systems Engineering and Electronics. All rights reserved.
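As a rough illustrative sketch only (this record does not give the formulation), an objective consistent with the abstract's description of MFAGFS, combining a robust matrix factorization that yields cluster labels, a row-sparse projection for feature selection guided by those labels, and an adaptively learned similarity graph, might take a form such as the following; all symbols and trade-off weights are assumptions for illustration, not the authors' notation.

\[
\min_{U,\,V,\,W,\,S}\ \|X - UV^{\top}\|_{2,1}
+ \alpha \,\|X^{\top} W - V\|_F^2
+ \beta \sum_{i,j} \|W^{\top} x_i - W^{\top} x_j\|_2^2 \, s_{ij}
+ \gamma \,\|W\|_{2,1}
+ \lambda \,\|S\|_F^2,
\quad \text{s.t. } S\mathbf{1} = \mathbf{1},\ S \ge 0,
\]

where \(X \in \mathbb{R}^{d \times n}\) stacks the \(n\) samples as columns, \(V\) plays the role of the cluster labels obtained from the robust (\(\ell_{2,1}\)-loss) factorization, rows of \(W\) with large \(\ell_2\)-norm identify the selected features, and \(S\) is the adaptive graph whose entries \(s_{ij}\) are re-learned from the projected data \(W^{\top}x_i\). In such a sketch, the first two terms let the cluster labels guide feature selection, while the graph term and the updates of \(S\) realize the interaction between local structure learning and feature selection described above.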
Pages: 2197-2208
Number of pages: 11