Sparse L1-norm quadratic surface support vector machine with Universum data

Cited: 3
Authors
Moosaei, Hossein [1 ,2 ]
Mousavi, Ahmad [3 ]
Hladik, Milan [4 ]
Gao, Zheming [5 ]
Affiliations
[1] Univ JE Purkyne, Fac Sci, Dept Informat, Usti Nad Labem, Czech Republic
[2] Charles Univ Prague, Fac Math & Phys, Sch Comp Sci, Dept Appl Math, Prague, Czech Republic
[3] Univ Florida, Informat Inst, Gainesville, FL 32611 USA
[4] Charles Univ Prague, Fac Math & Phys, Dept Appl Math, Prague, Czech Republic
[5] Northeastern Univ, Coll Informat Sci & Engn, Shenyang 110819, Liaoning, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Binary classification; Quadratic surface support vector machines; L1-norm regularization; Least squares; Universum data;
DOI
10.1007/s00500-023-07860-3
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
In binary classification, kernel-free quadratic surface support vector machines avoid difficulties such as choosing an appropriate kernel function or tuning its hyper-parameters. Furthermore, Universum data points, which belong to neither class, can be exploited to embed prior knowledge into the corresponding models and improve their general performance. This paper designs novel kernel-free Universum quadratic surface support vector machine models. It further proposes an L1-norm regularized version that helps detect potential sparsity patterns in the Hessian of the quadratic surface and reduces to standard linear models when the data points are (almost) linearly separable. The proposed models are convex, so standard numerical solvers can be used to solve them. Moreover, a least squares version of the L1-norm regularized model is proposed, together with an effective tailored algorithm that only requires solving one linear system. Several theoretical properties of these models are then stated and proved. The numerical results show that the least squares version of the proposed model achieves the highest mean accuracy scores with promising computational efficiency on some artificial and public benchmark data sets. Statistical tests are conducted to show the competitiveness of the proposed models.
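The kernel-free, least-squares idea summarized above can be sketched in a few lines: lift each point to its quadratic monomials, then fit the lifted coefficients by solving one linear system. This is a minimal illustration under simplifying assumptions, not the authors' exact model: it omits the Universum terms and the L1 penalty (which would require a convex solver rather than a single solve) and uses a plain ridge-style regularizer instead; all function names here are hypothetical.

```python
import numpy as np

def lift(X):
    # Map each x in R^n to its quadratic surface features:
    # monomials x_i * x_j for i <= j, the linear terms x, and a constant 1.
    n = X.shape[1]
    iu = np.triu_indices(n)
    quad = np.einsum('ki,kj->kij', X, X)[:, iu[0], iu[1]]
    return np.hstack([quad, X, np.ones((X.shape[0], 1))])

def fit_ls_qssvm(X, y, lam=1e-2):
    # Least-squares fit of the quadratic surface f(x) = z(x)^T w to labels +-1:
    # solve (Z^T Z + lam * I) w = Z^T y -- a single linear system.
    Z = lift(X)
    A = Z.T @ Z + lam * np.eye(Z.shape[1])
    return np.linalg.solve(A, Z.T @ y)

def predict(w, X):
    # Classify by the sign of the fitted quadratic surface.
    return np.sign(lift(X) @ w)
```

For example, two concentric point clouds (inner class +1, outer class -1) are not linearly separable, but a single linear solve on the lifted features separates them with a quadratic surface.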
Pages: 5567 - 5586 (20 pages)
Related Papers
50 records total
  • [21] On L1-norm multi-class Support Vector Machines
    Wang, Lifeng
    Shen, Xiaotong
    Zheng, Yuan F.
    ICMLA 2006: 5TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS, PROCEEDINGS, 2006: 83+
  • [22] An Error Bound for L1-norm Support Vector Machine Coefficients in Ultra-high Dimension
    Peng, Bo
    Wang, Lan
    Wu, Yichao
    JOURNAL OF MACHINE LEARNING RESEARCH, 2016, 17
  • [23] Stable feature selection based on the ensemble L1-norm support vector machine for biomarker discovery
    Moon, Myungjin
    Nakai, Kenta
    BMC GENOMICS, 2016, 17
  • [24] L1-norm loss-based projection twin support vector machine for binary classification
    Hua, Xiaopeng
    Xu, Sen
    Gao, Jun
    Ding, Shifei
    SOFT COMPUTING, 2019, 23 (21) : 10649 - 10659
  • [27] ν-twin support vector machine with Universum data for classification
    Xu, Yitian
    Chen, Mei
    Yang, Zhiji
    Li, Guohui
    APPLIED INTELLIGENCE, 2016, 44: 956 - 968
  • [28] Statistical margin error bounds for L1-norm support vector machines
    Chen, Liangzhi
    Zhang, Haizhang
    NEUROCOMPUTING, 2019, 339 : 210 - 216
  • [29] Non-Asymptotic Analysis of l1-Norm Support Vector Machines
    Kolleck, Anton
    Vybiral, Jan
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2017, 63 (09) : 5461 - 5476
  • [30] Divide-and-Conquer for Debiased l1-norm Support Vector Machine in Ultra-high Dimensions
    Lian, Heng
    Fan, Zengyan
    JOURNAL OF MACHINE LEARNING RESEARCH, 2018, 18