Efficient Large Margin-Based Feature Extraction

Citations: 0
Authors
Guodong Zhao
Yan Wu
Institutions
[1] Tongji University,College of Electronics and Information Engineering
[2] Shanghai Dian Ji University,School of Mathematics and Physics
Source
Neural Processing Letters | 2019, Vol. 50
Keywords
Dimensionality reduction; Weighted local modularity; Large margin; Hard samples;
DOI
Not available
Abstract
Supervised feature extraction aims to find a discriminative low-dimensional space in which samples of the same class cluster tightly and samples of different classes stay far apart. For most algorithms, it is difficult during the transformation to push samples that lie on the class margin or inside another class (called hard samples in this paper) toward their own class, and these hard samples frequently degrade performance. Handling them is therefore essential for an efficient method, yet few methods in recent years have been proposed specifically to address the hard-sample problem. In this study, large margin nearest neighbor (LMNN) and weighted local modularity (WLM) from complex networks are introduced to deal with these hard samples: LMNN pushes them toward their class quickly, while WLM shrinks samples sharing the same label toward the class as a whole, together yielding small within-class distances and a large margin between classes. Combining WLM with LMNN, a novel feature extraction method named WLMLMNN is proposed, which takes into account both the global and local consistency of the input data in the projected space. Comparative experiments with other popular methods on various real-world data sets demonstrate the effectiveness of the proposed method.
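As a rough illustration of the LMNN component the abstract builds on (not the authors' WLMLMNN method itself), the standard pull/push objective can be sketched as below; the function name and the simple O(n²) loops are illustrative assumptions:

```python
import numpy as np

def lmnn_loss(L, X, y, k=2, mu=0.5):
    """Classic LMNN objective for a linear map L (illustrative sketch).

    pull: squared distances to same-class "target neighbors" (k-NN);
    push: hinge penalty when a differently labeled point (an
    "impostor" -- the paper's hard sample) invades the unit margin.
    """
    Z = X @ L.T                       # project samples into the learned space
    n = len(X)
    pull, push = 0.0, 0.0
    for i in range(n):
        d = np.sum((Z - Z[i]) ** 2, axis=1)          # squared distances from i
        same = np.where((y == y[i]) & (np.arange(n) != i))[0]
        targets = same[np.argsort(d[same])[:k]]      # k nearest same-class points
        impostors = np.where(y != y[i])[0]           # differently labeled points
        for j in targets:
            pull += d[j]
            # hinge: impostors closer than d(i, j) + 1 incur a penalty
            push += np.sum(np.maximum(0.0, 1.0 + d[j] - d[impostors]))
    return (1 - mu) * pull + mu * push
```

With well-separated classes the push term vanishes and only the small pull term remains; when classes overlap, impostors inside the margin dominate the loss, which is exactly why hard samples are costly for margin-based methods.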
Pages: 1257 - 1279 (22 pages)
Related Papers
50 results total
  • [31] Transfer Efficiency Analysis of Margin-Based Programs
    Rude, James
    Ker, Alan
    [J]. CANADIAN JOURNAL OF AGRICULTURAL ECONOMICS-REVUE CANADIENNE D AGROECONOMIE, 2013, 61 (04): : 509 - 529
  • [32] Margin-Based Active Learning of Multiclass Classifiers
    Bressan, Marco
    Cesa-Bianchi, Nicolo
    Lattanzi, Silvio
    Paudice, Andrea
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25
  • [33] Margin-based active learning for structured predictions
    Small, Kevin
    Roth, Dan
    [J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2010, 1 (1-4) : 3 - 25
  • [34] A Margin-based MLE for Crowdsourced Partial Ranking
    Xu, Qianqian
    Xiong, Jiechao
    Sun, Xinwei
    Yang, Zhiyong
    Cao, Xiaochun
    Huang, Qingming
    Yao, Yuan
    [J]. PROCEEDINGS OF THE 2018 ACM MULTIMEDIA CONFERENCE (MM'18), 2018, : 591 - 599
  • [35] Sparse margin-based discriminant analysis for feature extraction
    Gu, Zhenghong
    Yang, Jian
    [J]. NEURAL COMPUTING AND APPLICATIONS, 2013, 23 : 1523 - 1529
  • [36] Margin-based Sampling in Deep Metric Learning
    Zhou, Shangwei
    Yu, Qingsong
    Sun, Jun
    [J]. ICBDC 2019: PROCEEDINGS OF 2019 4TH INTERNATIONAL CONFERENCE ON BIG DATA AND COMPUTING, 2019, : 277 - 280
  • [37] Margin-Based Discriminative Training for String Recognition
    Heigold, Georg
    Dreuw, Philippe
    Hahn, Stefan
    Schlueter, Ralf
    Ney, Hermann
    [J]. IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2010, 4 (06) : 917 - 925
  • [38] A note on margin-based loss functions in classification
    Lin, Y
    [J]. STATISTICS & PROBABILITY LETTERS, 2004, 68 (01) : 73 - 82
  • [39] On Margin-Based Cluster Recovery with Oracle Queries
    Bressan, Marco
    Cesa-Bianchi, Nicolo
    Lattanzi, Silvio
    Paudice, Andrea
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [40] Margin-based ranking meets boosting in the middle
    Rudin, C
    Cortes, C
    Mohri, M
    Schapire, RE
    [J]. LEARNING THEORY, PROCEEDINGS, 2005, 3559 : 63 - 78