Statistical margin error bounds for L1-norm support vector machines

Cited by: 7
Authors
Chen, Liangzhi [1 ]
Zhang, Haizhang [1 ,2 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Data & Comp Sci, Guangzhou 510006, Guangdong, Peoples R China
[2] Sun Yat Sen Univ, Guangdong Prov Key Lab Computat Sci, Guangzhou 510006, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Margin error bounds; L1-norm support vector machines; Geometrical interpretation; Fat-shattering dimension; Classification hyperplane; Kernel Banach spaces
DOI
10.1016/j.neucom.2019.02.015
Chinese Library Classification
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Compared with Lp-norm (1 < p < +infinity) Support Vector Machines (SVMs), the L1-norm SVM enjoys the nice property of simultaneously performing classification and feature selection. Margin error bounds for SVMs on Hilbert spaces (or, more generally, on q-uniformly smooth Banach spaces) have been obtained in the literature to justify the strategy of maximizing the margin in SVM. In this paper, we estimate the margin error bound for L1-norm SVM methods and give a geometrical interpretation of the result. We show that the fat-shattering dimensions of the Banach spaces l1 and l∞ are both infinite, so that margin error bounds based on this dimension are unavailable for the L1-norm SVM in the infinite-dimensional setting. We therefore establish margin error bounds for the SVM on finite-dimensional spaces equipped with the L1-norm, thus supplying statistical justification for the large margin classification of the L1-norm SVM on finite-dimensional spaces. To complete the theory, corresponding results for the L∞-norm SVM are also presented. (C) 2019 Elsevier B.V. All rights reserved.
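For illustration (not part of the original record): the training problem behind the L1-norm SVM of the abstract, minimize ||w||_1 + C * sum_i xi_i subject to y_i (w . x_i + b) >= 1 - xi_i and xi_i >= 0, is a linear program, and the L1 penalty on w is what produces the simultaneous feature selection mentioned above. The sketch below is an assumption of ours, not the authors' code; the helper l1_svm and the toy data are hypothetical, and only NumPy and scipy.optimize.linprog are used.

```python
# Minimal sketch: the soft-margin L1-norm SVM cast as a linear program.
import numpy as np
from scipy.optimize import linprog

def l1_svm(X, y, C=1.0):
    """Solve min ||w||_1 + C * sum(xi) s.t. y_i (w . x_i + b) >= 1 - xi_i, xi >= 0."""
    n, d = X.shape
    # Variable order: [w_pos (d), w_neg (d), b_pos, b_neg, xi (n)], all >= 0,
    # with w = w_pos - w_neg and b = b_pos - b_neg so the L1 norm is linear.
    c = np.concatenate([np.ones(2 * d), np.zeros(2), C * np.ones(n)])
    # Margin constraints y_i (w . x_i + b) + xi_i >= 1, rewritten as A z <= -1.
    A = np.hstack([
        -y[:, None] * X,          # w_pos block
        y[:, None] * X,           # w_neg block
        -y[:, None],              # b_pos column
        y[:, None],               # b_neg column
        -np.eye(n),               # slack variables xi
    ])
    res = linprog(c, A_ub=A, b_ub=-np.ones(n), bounds=(0, None), method="highs")
    z = res.x
    w = z[:d] - z[d:2 * d]
    b = z[2 * d] - z[2 * d + 1]
    return w, b

# Toy usage: only the first two of ten features carry signal, so the L1
# penalty should leave most of the remaining weights at (near) zero.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = np.sign(X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=200))
w, b = l1_svm(X, y, C=1.0)
print("nonzero weight indices:", np.flatnonzero(np.abs(w) > 1e-6))
```

Splitting w and b into nonnegative parts is the standard device for casting an L1 objective as a linear program; an analogous construction with a single auxiliary variable bounding max_j |w_j| would give the L∞-norm SVM mentioned at the end of the abstract.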
Pages: 210-216
Number of pages: 7