Bi-Level Spectral Feature Selection

Cited by: 0
Authors
Hu, Zebiao [1 ]
Wang, Jian [2 ]
Zhang, Kai [3 ,4 ]
Pedrycz, Witold [5 ,6 ,7 ,8 ]
Pal, Nikhil R. [9 ,10 ]
Affiliations
[1] China Univ Petr East China, Coll Control Sci & Engn, Qingdao 266580, Peoples R China
[2] China Univ Petr East China, Coll Sci, Qingdao 266580, Peoples R China
[3] China Univ Petr, Coll Petr Engn, Qingdao 266580, Peoples R China
[4] Qingdao Univ Technol, Sch Civil Engn, Qingdao 266520, Peoples R China
[5] Univ Alberta, Dept Elect & Comp Engn, Edmonton, AB T6G 2R3, Canada
[6] Macau Univ Sci & Technol, Inst Syst Engn, Macau, Peoples R China
[7] Syst Res Inst, Polish Acad Sci, PL-00901 Warsaw, Poland
[8] Istinye Univ, Res Ctr Performance & Prod Anal, TR-34010 Istanbul, Turkiye
[9] Techno India Univ, Kolkata 700091, India
[10] South Asian Univ, New Delhi 110068, India
Funding
National Natural Science Foundation of China;
Keywords
Feature extraction; Task analysis; Petroleum; Clustering algorithms; Classification algorithms; Optimization; Linear programming; Bi-level spectral feature selection (BLSFS); classification level; feature level; high-dimensional data; unsupervised learning; UNSUPERVISED FEATURE-SELECTION; SPARSE; GRAPH; REPRESENTATION; SCORE;
DOI
10.1109/TNNLS.2024.3408208
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Unsupervised feature selection (UFS) aims to learn an indicator matrix that relies on some characteristics of high-dimensional data to identify the features to be selected. However, traditional unsupervised methods operate only at the feature level, i.e., they directly select useful features by feature ranking. Such methods pay no attention to the interaction with other tasks such as classification, which severely degrades their feature selection performance. In this article, we propose a UFS method that also takes the classification level into account and selects features that perform well in both clustering and classification. To achieve this, we design a bi-level spectral feature selection (BLSFS) method that combines the classification level and the feature level. More concretely, at the classification level, we first apply spectral clustering to generate pseudolabels and then train a linear classifier to obtain the optimal regression matrix. At the feature level, we select useful features by maintaining the intrinsic structure of the data in the embedding space with the regression matrix learned at the classification level, which in turn guides classifier training. We utilize a balancing parameter to seamlessly bridge the classification and feature levels into a unified framework. A series of experiments on 12 benchmark datasets demonstrates the superiority of BLSFS in both clustering and classification performance.
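The pipeline outlined in the abstract (spectral clustering to obtain pseudolabels, a linear classifier to learn a regression matrix, and feature ranking guided by that matrix) can be illustrated with off-the-shelf components. The following is a minimal sketch, assuming scikit-learn's SpectralClustering and RidgeClassifier as stand-ins for the clustering and linear-classification steps and a simple column-norm score in place of the paper's structure-preserving criterion; the function name blsfs_sketch and all parameter choices are illustrative, not the authors' exact formulation or optimization.

```python
# Rough sketch of the bi-level idea: (1) classification level: spectral
# clustering -> pseudolabels -> linear classifier; (2) feature level: rank
# features by the column norms of the learned coefficient (regression) matrix.
# This mirrors the described pipeline only; it is NOT the BLSFS objective.
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.linear_model import RidgeClassifier

def blsfs_sketch(X, n_clusters=5, n_selected=50, alpha=1.0):
    """Return indices of `n_selected` features from X (n_samples x n_features)."""
    # Classification level: pseudolabels from spectral clustering on the raw data.
    pseudo_labels = SpectralClustering(
        n_clusters=n_clusters, affinity="nearest_neighbors", random_state=0
    ).fit_predict(X)

    # Train a linear classifier on the pseudolabels; its coefficient matrix
    # plays the role of the regression matrix learned at this level.
    clf = RidgeClassifier(alpha=alpha).fit(X, pseudo_labels)
    W = np.atleast_2d(clf.coef_)  # shape: (n_classes, n_features)

    # Feature level: score each feature by the l2 norm of its column in W
    # (a stand-in for the structure-preserving criterion in the paper).
    scores = np.linalg.norm(W, axis=0)
    selected = np.argsort(scores)[::-1][:n_selected]
    return selected, scores

# Example usage on random data (purely for illustration):
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 300))
    idx, _ = blsfs_sketch(X, n_clusters=4, n_selected=20)
    print("selected feature indices:", idx)
```

In the paper itself the two levels are coupled through a balancing parameter in a single objective, so the regression matrix and the selected features are learned jointly rather than in the one-pass fashion sketched here.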
Pages: 1-15
Number of pages: 15