Improving Fusion of Dimensionality Reduction Methods for Nearest Neighbor Classification

Cited by: 2
Authors
Deegalla, Sampath [1]
Bostrom, Henrik [1]
Affiliation
[1] Stockholm Univ, Dept Comp & Syst Sci, SE-16440 Kista, Sweden
Keywords
nearest neighbor classification; dimensionality reduction; feature fusion; classifier fusion; microarrays; CANCER; TUMOR; PREDICTION; PATTERNS;
DOI
10.1109/ICMLA.2009.95
Chinese Library Classification
TP18 [Theory of artificial intelligence];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In previous studies, improving the performance of nearest neighbor classification of high-dimensional data, such as microarrays, has been investigated using dimensionality reduction. It has been demonstrated that fusing dimensionality reduction methods, either by fusing the classifiers obtained from each set of reduced features or by fusing all reduced features, is better than using any single dimensionality reduction method. However, none of the fusion methods consistently outperforms the use of a single dimensionality reduction method. Therefore, a new way of fusing features and classifiers is proposed, which is based on searching for the optimal number of dimensions for each considered dimensionality reduction method. An empirical evaluation on microarray classification is presented, comparing classifier and feature fusion with and without the proposed method, in conjunction with three dimensionality reduction methods: Principal Component Analysis (PCA), Partial Least Squares (PLS) and Information Gain (IG). The new classifier fusion method outperforms the previous one in 4 out of 8 cases and is on par with the best single dimensionality reduction method. The novel feature fusion method is, however, outperformed by the previous method, which selects the same number of features from each dimensionality reduction method. Hence, it is concluded that the idea of optimizing the number of features separately for each dimensionality reduction method can only be recommended for classifier fusion.
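The sketch below illustrates, in broad strokes, the classifier fusion idea described in the abstract: tune the number of reduced dimensions separately for each dimensionality reduction method and then fuse the resulting nearest neighbor classifiers by voting. This is not the authors' code; it assumes scikit-learn, uses mutual information as a stand-in for Information Gain, treats PLS regression scores as PLS features, and runs on synthetic data in place of a microarray set. Candidate dimensionalities and fold counts are illustrative choices.

```python
# Minimal sketch (not the original implementation) of per-method dimensionality
# tuning followed by classifier fusion with 1-nearest-neighbor classifiers.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import Pipeline
from sklearn.model_selection import GridSearchCV, cross_val_score
from sklearn.ensemble import VotingClassifier

# Synthetic high-dimensional data as a stand-in for a microarray data set.
X, y = make_classification(n_samples=200, n_features=500, n_informative=20,
                           random_state=0)

dims = [5, 10, 25, 50]  # candidate numbers of reduced dimensions

# One pipeline per dimensionality reduction method, each ending in 1-NN.
# Note: feeding class labels to PLSRegression is a common simplification;
# SelectKBest with mutual information approximates Information Gain ranking.
candidates = {
    "pca": (Pipeline([("red", PCA()),
                      ("knn", KNeighborsClassifier(n_neighbors=1))]),
            {"red__n_components": dims}),
    "pls": (Pipeline([("red", PLSRegression()),
                      ("knn", KNeighborsClassifier(n_neighbors=1))]),
            {"red__n_components": dims}),
    "ig":  (Pipeline([("red", SelectKBest(mutual_info_classif)),
                      ("knn", KNeighborsClassifier(n_neighbors=1))]),
            {"red__k": dims}),
}

# Search for the best number of dimensions *per method* (the proposed idea),
# instead of forcing the same number of features on every method.
tuned = []
for name, (pipe, grid) in candidates.items():
    search = GridSearchCV(pipe, grid, cv=5).fit(X, y)
    print(name, "best dims:", search.best_params_, "cv acc:", search.best_score_)
    tuned.append((name, search.best_estimator_))

# Classifier fusion: majority vote over the individually tuned 1-NN classifiers.
fused = VotingClassifier(estimators=tuned, voting="hard")
print("fused cv acc:", cross_val_score(fused, X, y, cv=5).mean())
```

In this sketch each reduction method is allowed its own optimal dimensionality before fusion; forcing a common number of dimensions on all methods would correspond to the earlier fusion scheme the abstract compares against.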
Pages: 771 - 775
Page count: 5