On L1-norm multiclass support vector machines: Methodology and theory

Cited by: 71
Authors
Wang, Lifeng [1 ]
Shen, Xiaotong [1 ]
Affiliations
[1] Univ Minnesota, Sch Stat, Minneapolis, MN 55455 USA
Funding
U.S. National Science Foundation
Keywords
high-dimension but low sample size; margin classification; regularization; sparsity; variable selection;
DOI
10.1198/016214506000001383
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Codes
020208; 070103; 0714;
Abstract
Binary support vector machines (SVMs) have been proven to deliver high performance. In multiclass classification, however, issues remain with respect to variable selection. One challenging issue is classification and variable selection when the number of variables is on the order of thousands, greatly exceeding the training sample size. This often occurs in genomics classification. To meet the challenge, this article proposes a novel multiclass support vector machine that performs classification and variable selection simultaneously through an L1-norm penalized sparse representation. The proposed methodology, together with the developed regularization solution path, permits variable selection in such a situation. For the proposed methodology, a statistical learning theory is developed to quantify the generalization error in an attempt to gain insight into the basic structure of sparse learning, permitting the number of variables to greatly exceed the sample size. The operating characteristics of the methodology are examined on both simulated and benchmark data and compared against several competitors in terms of prediction accuracy. The numerical results suggest that the proposed methodology is highly competitive.
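The core idea of the abstract, that an L1-norm penalty zeroes out coefficients exactly and thereby performs variable selection alongside classification, can be sketched in a few lines of numpy. This is not the authors' exact formulation (their paper uses a specific multiclass SVM loss with a sum-to-zero constraint and a full regularization solution path); a one-vs-rest hinge loss with a proximal (sub)gradient solver stands in for illustration, and all names and parameter values below are illustrative choices.

```python
import numpy as np

def soft_threshold(w, t):
    """Proximal operator of the L1 norm: shrink each entry toward zero by t."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def fit_l1_msvm(X, y, n_classes, lam=0.1, lr=0.1, n_iter=500):
    """L1-penalized linear multiclass classifier (one-vs-rest hinge loss),
    fit by proximal subgradient descent. Illustrative stand-in, not the
    paper's exact estimator."""
    n, p = X.shape
    W = np.zeros((p, n_classes))
    T = 2.0 * np.eye(n_classes)[y] - 1.0        # +/-1 one-vs-rest targets
    for _ in range(n_iter):
        scores = X @ W                          # (n, n_classes)
        mask = (1.0 - T * scores > 0)           # margin violations
        G = -(X.T @ (T * mask)) / n             # hinge subgradient
        W = soft_threshold(W - lr * G, lr * lam)  # L1 proximal step
    return W

# Synthetic data in the spirit of "many variables, few relevant":
# only the first 3 of 50 variables determine the class label.
rng = np.random.default_rng(0)
n, p, k = 120, 50, 3
X = rng.standard_normal((n, p))
W_true = np.zeros((p, k))
W_true[0, 0] = W_true[1, 1] = W_true[2, 2] = 2.0
y = np.argmax(X @ W_true, axis=1)

W = fit_l1_msvm(X, y, k)
acc = float((np.argmax(X @ W, axis=1) == y).mean())
sparsity = 1.0 - np.count_nonzero(W) / W.size   # fraction of exact zeros
```

The soft-thresholding step is what distinguishes the L1 penalty from the usual L2 (ridge-type) SVM penalty: it sets small coefficients to exactly zero, so the fitted model both classifies and names the selected variables, even when p exceeds n.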
Pages: 583-594 (12 pages)
Related Papers (50 total)
  • [1] On L1-norm multi-class Support Vector Machines
    Wang, Lifeng
    Shen, Xiaotong
    Zheng, Yuan F.
    [J]. ICMLA 2006: 5TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS, PROCEEDINGS, 2006, : 83 - +
  • [2] Statistical margin error bounds for L1-norm support vector machines
    Chen, Liangzhi
    Zhang, Haizhang
    [J]. NEUROCOMPUTING, 2019, 339 : 210 - 216
  • [3] Non-Asymptotic Analysis of l1-Norm Support Vector Machines
    Kolleck, Anton
    Vybiral, Jan
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 2017, 63 (09) : 5461 - 5476
  • [4] L1-norm twin support vector quantile regression
    Ye, Ya-Fen
    Wang, Chen-Xuan
    Tian, Jia-Sen
    Chen, Wei-Jie
    [J]. APPLIED SOFT COMPUTING, 2025, 169
  • [5] Capped L1-Norm Proximal Support Vector Machine
    Ren, Pei-Wei
    Li, Chun-Na
    Shao, Yuan-Hai
    [J]. MATHEMATICAL PROBLEMS IN ENGINEERING, 2022, 2022
  • [6] Robust capped L1-norm twin support vector machine
    Wang, Chunyan
    Ye, Qiaolin
    Luo, Peng
    Ye, Ning
    Fu, Liyong
    [J]. NEURAL NETWORKS, 2019, 114 : 47 - 59
  • [7] l1-norm Nonparallel Support Vector Machine for PU Learning
    Bai, Fusheng
    Yuan, Yongjia
    [J]. 2018 IEEE 23RD INTERNATIONAL CONFERENCE ON DIGITAL SIGNAL PROCESSING (DSP), 2018,
  • [8] Least squares twin bounded support vector machines based on L1-norm distance metric for classification
    Yan, He
    Ye, Qiaolin
    Zhang, Tian'an
    Yu, Dong-Jun
    Yuan, Xia
    Xu, Yiqing
    Fu, Liyong
    [J]. PATTERN RECOGNITION, 2018, 74 : 434 - 447
  • [9] ROBUST CAPPED L1-NORM PROJECTION TWIN SUPPORT VECTOR MACHINE
    Yang, Linxi
    Wang, Yan
    LI, Guoquan
    [J]. JOURNAL OF INDUSTRIAL AND MANAGEMENT OPTIMIZATION, 2023, 19 (08) : 5797 - 5815
  • [10] l1-Norm support vector machine for ranking with exponentially strongly mixing sequence
    Chen, Di-Rong
    Huang, Shou-You
    [J]. INTERNATIONAL JOURNAL OF WAVELETS MULTIRESOLUTION AND INFORMATION PROCESSING, 2014, 12 (05)