PreCLAS: An Evolutionary Tool for Unsupervised Feature Selection

Cited by: 1
Authors
Carballido, Jessica A. [1 ]
Ponzoni, Ignacio [1 ]
Cecchini, Rocio L. [1 ]
Affiliations
[1] Univ Nacl Sur, Dept Comp Sci & Engn, Inst Comp Sci & Engn UNS, CONICET, Bahia Blanca, Buenos Aires, Argentina
Keywords
Clustering tendency; Classification strategies; Evolutionary algorithm; Unsupervised feature selection; Microarray data analysis; Instance selection; Reduction
DOI
10.1007/978-3-030-61705-9_15
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Several research areas face data matrices that are not suitable for traditional clustering, regression, or classification strategies. For example, biological "omics" problems yield matrices with thousands or millions of rows but fewer than a hundred columns. This structure hinders traditional data analysis methods and calls for a means of reducing the number of rows. This article presents an unsupervised approach called PreCLAS for preprocessing matrices with such dimension problems so that the resulting data are suitable for clustering and classification strategies. PreCLAS is an unsupervised evolutionary strategy that searches for a submatrix with a drastically reduced number of rows, preferring rows that together exhibit some group structure. Experimentation was carried out in two stages. First, to assess its functionality, a benchmark dataset was studied in a clustering context. Then, a microarray dataset with genomic information was analyzed, and PreCLAS was used to select informative genes for classification strategies. The experiments show that the new method drastically reduces the number of rows of a matrix, performing effective unsupervised feature selection for both classification and clustering problems.
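The abstract describes an evolutionary search for a row submatrix whose rows exhibit group structure. The sketch below illustrates one way such a row-subset genetic algorithm could look, assuming a k-means/silhouette clustering-tendency fitness, union-based crossover, and swap mutation; all names and parameters are hypothetical and do not reproduce the authors' actual PreCLAS operators or fitness function.

```python
# Illustrative sketch only: an evolutionary row-subset selector in the spirit of
# PreCLAS. The fitness choice (silhouette of a k-means clustering) and all
# operator/parameter names are assumptions, not the paper's implementation.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score


def fitness(X, rows, n_clusters=3):
    """Score a candidate row subset by how strongly its rows group together."""
    sub = X[rows]
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(sub)
    return silhouette_score(sub, labels)


def mutate(child, n_rows, mut_rate, rng):
    """Swap a few selected rows for rows currently outside the subset."""
    child = child.copy()
    outside = np.setdiff1d(np.arange(n_rows), child)
    for i in range(len(child)):
        if outside.size and rng.random() < mut_rate:
            j = rng.integers(outside.size)
            child[i], outside[j] = outside[j], child[i]
    return child


def select_rows(X, subset_size=50, pop_size=20, generations=30, mut_rate=0.1, seed=0):
    """Evolve a drastically reduced row subset of X that shows cluster structure."""
    rng = np.random.default_rng(seed)
    n_rows = X.shape[0]
    # Each individual is a fixed-size set of row indices.
    pop = [rng.choice(n_rows, subset_size, replace=False) for _ in range(pop_size)]
    for _ in range(generations):
        scores = np.array([fitness(X, ind) for ind in pop])
        order = np.argsort(scores)[::-1]
        survivors = [pop[i] for i in order[: pop_size // 2]]   # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.choice(len(survivors), 2, replace=False)
            pool = np.union1d(survivors[a], survivors[b])      # crossover: sample from union
            child = rng.choice(pool, subset_size, replace=False)
            children.append(mutate(child, n_rows, mut_rate, rng))
        pop = survivors + children
    best = max(pop, key=lambda ind: fitness(X, ind))
    return np.sort(best)


if __name__ == "__main__":
    # Toy data: 300 rows (e.g., genes) measured over 10 columns (e.g., samples),
    # generated as three shifted Gaussian blocks.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(loc=c, size=(100, 10)) for c in (0, 3, 6)])
    rows = select_rows(X, subset_size=30)
    print("Selected", rows.size, "of", X.shape[0], "rows")
```

Under these assumptions, the returned row indices define a reduced submatrix that can be handed to downstream clustering or classification methods in place of the full matrix, which is the preprocessing role the abstract attributes to PreCLAS.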
Pages: 172-182
Number of pages: 11