Efficient Linear Feature Extraction Based on Large Margin Nearest Neighbor

Citations: 2
|
Authors
Zhao, Guodong [1 ]
Zhou, Zhiyong [2 ]
Affiliations
[1] Shanghai DianJi Univ, Sch Math & Phys, Shanghai 201306, Peoples R China
[2] Shanghai DianJi Univ, Sch Design & Art, Shanghai 201306, Peoples R China
Keywords
Linear dimensionality reduction; large margin nearest neighbor; linear discriminant analysis; hard samples;
DOI
10.1109/ACCESS.2019.2921665
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Linear feature extraction methods have become indispensable tools in pattern recognition. Linear dimensionality reduction optimizes an objective to produce a linear transformation and yields discriminative low-dimensional transformed data in which similarly labeled samples cluster tightly and differently labeled samples stay far apart. Most previous methods enlarge the between-class distance by maximizing the distance between class-center means so that differently labeled samples separate from one another. However, samples located at the class margin or inside other classes, called hard samples in this study, often move slowly under such objectives, which leaves the between-class distance small and degrades classification performance. Hence, we utilize the large margin nearest neighbor (LMNN) criterion to quickly push only the hard samples toward their class centers, enlarging the class margin rather than the between-class-center distance and further improving classification performance. Combined with linear discriminant analysis (LDA), a novel linear feature extraction method called LDA-LMNN is proposed, which addresses the limitations of LDA. Furthermore, dropout is applied to the learning of the linear transformation matrix to improve generalization ability and prevent overfitting. Comparative experiments against state-of-the-art feature extraction methods on various real-world datasets demonstrate the effectiveness of the proposed method.
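The LMNN criterion the abstract builds on can be illustrated with a minimal numpy sketch. The function below implements the standard LMNN pull/push hinge loss on linearly transformed data Z = XLᵀ, not the authors' exact LDA-LMNN objective; the function name `lmnn_push_loss`, the toy data, and the single-target-neighbor choice are illustrative assumptions.

```python
import numpy as np

def lmnn_push_loss(L, X, y, margin=1.0):
    """Illustrative LMNN-style loss on linearly transformed data Z = X @ L.T
    (a sketch, not the paper's exact objective). For each sample, pull its
    nearest same-class neighbor close, and apply a hinge penalty whenever a
    differently labeled sample comes within the target-neighbor distance plus
    a margin -- these violators are the 'hard samples' the abstract refers to."""
    Z = X @ L.T
    n = len(X)
    # pairwise squared Euclidean distances in the transformed space
    D = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    pull, push = 0.0, 0.0
    for i in range(n):
        same = np.where((y == y[i]) & (np.arange(n) != i))[0]
        diff = np.where(y != y[i])[0]
        j = same[np.argmin(D[i, same])]      # nearest same-class target neighbor
        pull += D[i, j]                      # keep the target neighbor close
        # hinge penalty on differently labeled samples inside the margin
        push += np.maximum(0.0, margin + D[i, j] - D[i, diff]).sum()
    return pull + push

# Toy usage: stretching the discriminative axis pushes impostors past the
# margin, so the loss drops relative to the identity transformation.
X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 0.0], [1.1, 0.0]])
y = np.array([0, 0, 1, 1])
print(lmnn_push_loss(np.eye(2), X, y))            # identity transform
print(lmnn_push_loss(np.diag([3.0, 1.0]), X, y))  # stretched transform, lower loss
```

In a full method such as the one described above, L would be optimized by gradient descent on this loss (the paper additionally combines it with the LDA objective and applies dropout to L).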
Pages: 78616-78624
Number of pages: 9