Sparse and low-dimensional representation with maximum entropy adaptive graph for feature selection

Times Cited: 21
Authors
Shang, Ronghua [1 ]
Zhang, Xinlei [1 ]
Feng, Jie [1 ]
Li, Yangyang [1 ]
Jiao, Licheng [1 ]
Affiliations
[1] Xidian Univ, Key Lab Intelligent Percept & Image Understanding, Collaborat Innovat Ctr Quantum Informat Shaanxi P, Minist Educ,Sch Artificial Intelligence, Xian 710071, Shaanxi, Peoples R China
Funding
National Key Research and Development Program of China; National Natural Science Foundation of China;
Keywords
Sparse transformation; Pseudo-label matrix; Maximum entropy; Adaptive manifold learning; Feature selection; UNSUPERVISED FEATURE-SELECTION; NONNEGATIVE MATRIX FACTORIZATION; REGRESSION; SIMILARITY; LOCALITY;
DOI
10.1016/j.neucom.2022.02.038
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Traditional feature selection algorithms usually explore the relationship between the data and the cluster structure in a single space, so the internal relationships they capture are limited and insufficient for selecting the most valuable features. To address this problem and fully mine the intrinsic correlation information in different spaces, so that the relationships among data, features, and clustering structure can be explored simultaneously, this paper proposes a feature selection method based on sparse and low-dimensional representation with a maximum entropy adaptive graph (SLMEA). First, SLMEA jointly optimizes the sparse transform representation and pseudo-label matrix learning, and uses the pseudo-label matrix to guide the learning of the sparse low-dimensional space. This not only explores the relationship between data and pseudo-labels in the data space, but also mines the association between features and pseudo-labels in the feature space, so that more discriminative features are selected. Second, based on maximum entropy theory, the similarity matrix is constructed adaptively, so that the two manifold structures corresponding to the sparse transform representation and the pseudo-label matrix learning can be adaptively learned and preserved during the iterations. In addition, to ensure the sparsity of the transformation matrix, an l_{2,1/2}-norm constraint is imposed on it, which handles redundant features better and yields sparser solutions. Finally, SLMEA optimizes the objective function with an alternating iterative update scheme, and extensive experiments are carried out on eight mainstream datasets. Compared with seven state-of-the-art algorithms, SLMEA achieves higher clustering accuracy and normalized mutual information. (C) 2022 Elsevier B.V. All rights reserved.
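To make two ingredients of the abstract concrete, the following Python/NumPy sketch illustrates (a) a maximum-entropy adaptive similarity graph, obtained by minimizing entropy-regularized weighted distances over each row of the similarity matrix, which reduces to a row-wise softmax over negative squared distances, and (b) one common form of the l_{2,1/2} penalty (the sum of square roots of the row-wise l2 norms) used to encourage row sparsity of a transformation matrix. This is a minimal sketch under those assumptions, not the SLMEA implementation described in the paper; the function names max_entropy_similarity and l21_half_penalty and the parameter gamma are illustrative.

import numpy as np


def max_entropy_similarity(X, gamma=1.0):
    """Row-stochastic similarity matrix for the rows of X (n_samples, n_features).

    Each row solves  min_s  sum_j d_ij * s_j + gamma * s_j * log(s_j)
    subject to s summing to 1, where d_ij is the squared Euclidean distance;
    the closed-form solution is a softmax over -d_ij / gamma.
    """
    sq = np.sum(X ** 2, axis=1)
    # Pairwise squared Euclidean distances, clipped at zero for numerical safety.
    dist2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    np.fill_diagonal(dist2, np.inf)              # exclude self-similarity
    logits = -dist2 / gamma
    logits -= logits.max(axis=1, keepdims=True)  # stabilize the softmax
    S = np.exp(logits)
    return S / S.sum(axis=1, keepdims=True)


def l21_half_penalty(W, eps=1e-12):
    """Sum of square roots of the row-wise l2 norms of W (promotes row sparsity)."""
    return float(np.sum(np.sqrt(np.linalg.norm(W, axis=1) + eps)))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((20, 5))                   # 20 toy samples, 5 features
    S = max_entropy_similarity(X, gamma=2.0)
    print(S.shape, np.allclose(S.sum(axis=1), 1.0))    # (20, 20) True
    print(l21_half_penalty(rng.standard_normal((5, 3))))

Note that in SLMEA the similarity graph is learned adaptively within the alternating updates together with the other variables, whereas the sketch above computes it once from fixed data; the larger gamma is, the smoother (higher-entropy) each row of the graph becomes.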
Pages: 57-73
Number of pages: 17
Related Papers
50 records in total
  • [1] Maximum Entropy Linear Manifold for Learning Discriminative Low-Dimensional Representation
    Czarnecki, Wojciech Marian
    Jozefowicz, Rafal
    Tabor, Jacek
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2015, PT I, 2015, 9284 : 52 - 67
  • [2] Unsupervised feature selection via graph matrix learning and the low-dimensional space learning for classification
    Han, Xiaohong
    Liu, Ping
    Wang, Li
    Li, Dengao
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2020, 87
  • [3] Semi-supervised sparse feature selection based on low-dimensional space Hessian regularization considering feature manifolds
    Wu, Xinping
    Chen, Hongmei
    Li, Tianrui
    Li, Chuanwei
    DEVELOPMENTS OF ARTIFICIAL INTELLIGENCE TECHNOLOGIES IN COMPUTATION AND ROBOTICS, 2020, 12 : 93 - 100
  • [4] Low-Dimensional Sensory Feature Representation by Trigeminal Primary Afferents
    Bale, Michael R.
    Davies, Kyle
    Freeman, Oliver J.
    Ince, Robin A. A.
    Petersen, Rasmus S.
    JOURNAL OF NEUROSCIENCE, 2013, 33 (29) : 12003 - 12012
  • [5] How can a sparse representation be made applicable for very low-dimensional data?
    Zhang, Hongzhi
    Li, Feng
    Liu, Pengju
    Chen, Yan
    Ren, Dongwei
    Wang, Kuanquan
    EXPERT SYSTEMS WITH APPLICATIONS, 2017, 77 : 66 - 70
  • [6] Robust Adaptive Low-Rank and Sparse Embedding for Feature Representation
    Wang, Lei
    Zhang, Zhao
    Liu, Guangcan
    Ye, Qiaolin
    Qin, Jie
    Wang, Meng
    2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 800 - 805
  • [7] Low-dimensional topology, low-dimensional field theory and representation theory
    Fuchs, Juergen
    Schweigert, Christoph
    REPRESENTATION THEORY - CURRENT TRENDS AND PERSPECTIVES, 2017, : 255 - 267
  • [8] Feature Selection Based on Graph Representation
    Akhiat, Yassine
    Chahhou, Mohamed
    Zinedine, Ahmed
    2018 IEEE 5TH INTERNATIONAL CONGRESS ON INFORMATION SCIENCE AND TECHNOLOGY (IEEE CIST'18), 2018, : 232 - 237
  • [9] Sparse low-redundancy multi-label feature selection with adaptive dynamic dual graph constraints
    Wu, Yanhong
    Bai, Jianxia
    APPLIED INTELLIGENCE, 2025, 55 (03)
  • [10] Multi-feature sparse representation based on adaptive graph constraint for cropland delineation
    Zeng, Shaohua
    Wang, Meiyang
    Jia, Hongjie
    Hu, Jing
    Li, Jiao
    OPTICS EXPRESS, 2024, 32 (04) : 6463 - 6480