Reducing Class Overlapping in Supervised Dimension Reduction

Cited by: 1
Authors
Nguyen Trong Tung [1]
Vu Hoang Dieu [1]
Khoat Than [1]
Ngo Van Linh [1]
Affiliations
[1] Hanoi Univ Sci & Technol, Hanoi, Vietnam
Keywords
Class overlapping; Supervised dimension reduction; Probabilistic topic models;
DOI
10.1145/3287921.3287925
CLC number
TP301 [Theory, Methods];
Discipline code
081202;
Abstract
Dimension reduction aims to find a low-dimensional subspace onto which high-dimensional data can be projected such that the discriminative properties of the original data are preserved. In supervised dimension reduction, class labels are integrated into the lower-dimensional representation to produce better results on classification tasks. The supervised dimension reduction (SDR) framework of [17] is one of the state-of-the-art methods; it takes into account not only the class labels but also the neighborhood graphs of the data, and has advantages in preserving the within-class local structure and widening the between-class margin. However, the reduced-dimensional representation produced by the SDR framework suffers from the class-overlapping problem, in which data points lie closer to a different class than to the class they belong to. Class overlapping can hurt classification performance. In this paper, we propose a new method to reduce the overlap for the SDR framework in [17]. The experimental results show that our method reduces the size of the overlapping set by an order of magnitude. As a result, our method significantly outperforms the pre-existing framework on the classification task. Moreover, visualization plots show that the reduced-dimensional representation learned by our method is more scattered within each class and more separated between classes, compared to the pre-existing SDR framework.
Pages: 8-15
Page count: 8
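
The abstract characterizes class overlapping as data points lying closer to a different class than to their own. As a minimal sketch of how such an overlapping set might be counted in a reduced-dimensional representation, the Python snippet below flags points whose nearest class centroid differs from their label. The function name overlapping_set and the centroid-based criterion are illustrative assumptions for this sketch, not the paper's exact definition, which may instead rely on the neighborhood graphs mentioned in the abstract.

```python
import numpy as np

def overlapping_set(Z, y):
    """Return indices of points that lie closer to another class's
    centroid than to their own class's centroid -- one illustrative
    notion of 'class overlapping' in a reduced space.

    Z : (n_samples, n_dims) reduced-dimensional representation
    y : (n_samples,) integer class labels
    """
    classes = np.unique(y)
    # Centroid of each class in the reduced space.
    centroids = np.stack([Z[y == c].mean(axis=0) for c in classes])
    # Distance from every point to every class centroid.
    dists = np.linalg.norm(Z[:, None, :] - centroids[None, :, :], axis=2)
    nearest = classes[np.argmin(dists, axis=1)]
    # A point "overlaps" if its nearest centroid is not its own class.
    return np.where(nearest != y)[0]

# Toy usage: two partially overlapping 2-D Gaussian classes.
rng = np.random.default_rng(0)
Z = np.vstack([rng.normal(0.0, 1.0, (100, 2)),
               rng.normal(1.5, 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
print("overlap size:", overlapping_set(Z, y).size)
```

A neighborhood-based criterion (nearest labeled neighbor rather than nearest class centroid) could be substituted by replacing the centroid distances with k-nearest-neighbor lookups, which would be closer in spirit to a graph-based SDR setting.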