Sparse and robust support vector machine with capped squared loss for large-scale pattern classification

Cited: 0
Authors
Wang, Huajun [1 ]
Zhang, Hongwei [1 ]
Li, Wenqian [2 ]
Affiliations
[1] Changsha Univ Sci & Technol, Dept Math & Stat, Changsha, Peoples R China
[2] Hunan Normal Univ, Coll Life Sci, Changsha, Peoples R China
Keywords
Capped squared loss; Fast algorithm; Support vectors; Low computational complexity; Working set;
DOI
10.1016/j.patcog.2024.110544
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Support vector machine (SVM), considered one of the most efficient tools for classification, has received widespread attention across many fields. However, its performance suffers on large-scale pattern classification tasks due to high memory requirements and slow running speed. To address this challenge, we construct a novel sparse and robust SVM based on our newly proposed capped squared loss (named L-csl-SVM). To solve L-csl-SVM, we first establish its optimality theory via our defined proximal stationary point, which allows us to efficiently characterize the L-csl support vectors of L-csl-SVM. We then demonstrate that the L-csl support vectors comprise only a small fraction of the entire training data, an observation that leads us to introduce the concept of a working set. Furthermore, we design a novel subspace fast algorithm with a working set (named L-csl-ADMM) for solving L-csl-SVM, and we prove that L-csl-ADMM enjoys both global convergence and relatively low computational complexity. Finally, numerical experiments show that L-csl-ADMM achieves the best classification accuracy, the shortest running time, and the smallest number of support vectors when solving large-scale pattern classification problems.
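The abstract does not spell out the capped squared loss, and the exact definition sits behind the DOI. As a rough illustration of the idea, a common form of such a loss is the squared hinge loss truncated at a cap: L(t) = min((max(0, 1 - t))^2, a), where t = y*f(x) is the classification margin and a > 0 is the cap. The cap bounds the penalty an outlier can contribute (robustness), and the flat region past the cap means such points exert no pull on the solution (sparsity of support vectors). This sketch is an assumption about the general family of capped losses, not the paper's exact formulation.

```python
import numpy as np

def capped_squared_loss(t, cap=1.0):
    """Squared hinge loss truncated at `cap` (illustrative form only;
    the paper's exact L-csl definition may differ).

    t   : classification margin y * f(x), scalar or array
    cap : truncation level a > 0 bounding the per-sample penalty
    """
    t = np.asarray(t, dtype=float)
    # Squared hinge: zero for margins >= 1, quadratic growth otherwise;
    # the outer minimum caps the penalty for badly misclassified points.
    return np.minimum(np.maximum(0.0, 1.0 - t) ** 2, cap)

# Correctly classified points beyond the margin incur zero loss;
# a gross outlier (t = -5) is capped instead of growing quadratically.
margins = np.array([2.0, 1.0, 0.5, 0.0, -5.0])
print(capped_squared_loss(margins))  # [0.   0.   0.25 1.   1.  ]
```

Under this form, only points with 0 < L(t) < cap lie in the "active" region, which is consistent with the abstract's claim that the L-csl support vectors form a small fraction of the training data.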
Pages: 18