Sparse and robust support vector machine with capped squared loss for large-scale pattern classification

Cited by: 0
Authors
Wang, Huajun [1 ]
Zhang, Hongwei [1 ]
Li, Wenqian [2 ]
Affiliations
[1] Changsha Univ Sci & Technol, Dept Math & Stat, Changsha, Peoples R China
[2] Hunan Normal Univ, Coll Life Sci, Changsha, Peoples R China
Keywords
Capped squared loss; Fast algorithm; Support vectors; Low computational complexity; Working set;
DOI
10.1016/j.patcog.2024.110544
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Support vector machine (SVM), regarded as one of the most efficient tools for classification, has received widespread attention in various fields. However, its performance degrades on large-scale pattern classification tasks because of high memory requirements and slow training. To address this challenge, we construct a novel sparse and robust SVM based on our newly proposed capped squared loss (named L-csl-SVM). To solve L-csl-SVM, we first establish its optimality theory via our defined proximal stationary point, which allows us to efficiently characterize the L-csl support vectors of L-csl-SVM. We then demonstrate that the L-csl support vectors comprise only a small fraction of the entire training data, an observation that leads us to introduce the concept of the working set. Furthermore, we design a novel subspace fast algorithm with a working set (named L-csl-ADMM) for solving L-csl-SVM, and prove that L-csl-ADMM enjoys both global convergence and relatively low computational complexity. Finally, numerical experiments show that L-csl-ADMM achieves excellent performance, obtaining the best classification accuracy, the shortest running time, and the smallest number of support vectors when solving large-scale pattern classification problems.
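The exact definition of the capped squared loss is given in the paper (DOI above). As an illustrative sketch only, a common way to build such a loss is to cap the squared hinge loss at an assumed threshold a > 0, which bounds the influence of outliers:
\[
\ell_{\mathrm{csl}}(t) = \min\bigl\{\bigl(\max\{0,\,1-t\}\bigr)^{2},\; a\bigr\},
\qquad
\min_{w,\,b}\ \frac{1}{2}\|w\|_{2}^{2} + C\sum_{i=1}^{m}\ell_{\mathrm{csl}}\bigl(y_{i}(w^{\top}x_{i}+b)\bigr).
\]
Under this assumed form, a heavily misclassified outlier contributes at most a to the objective (robustness), and only samples whose margin loss is active and uncapped can act as support vectors, which is consistent with the abstract's claim that the L-csl support vectors form only a minor fraction of the training data. The threshold a and this specific capped construction are assumptions for illustration, not the paper's exact formulation.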
Pages: 18
Related Papers
50 records in total
  • [1] Weighted linear loss twin support vector machine for large-scale classification
    Shao, Yuan-Hai
    Chen, Wei-Jie
    Wang, Zhen
    Li, Chun-Na
    Deng, Nai-Yang
    [J]. KNOWLEDGE-BASED SYSTEMS, 2015, 73 : 276 - 288
  • [2] LINEX Support Vector Machine for Large-Scale Classification
    Ma, Yue
    Zhang, Qin
    Li, Dewei
    Tian, Yingjie
    [J]. IEEE ACCESS, 2019, 7 : 70319 - 70331
  • [3] Large-scale support vector machine classification with redundant data reduction
    Shen, Xiang-Jun
    Mu, Lei
    Li, Zhen
    Wu, Hao-Xiang
    Gou, Jian-Ping
    Chen, Xin
    [J]. NEUROCOMPUTING, 2016, 172 : 189 - 197
  • [4] Robust twin support vector machine for pattern classification
    Qi, Zhiquan
    Tian, Yingjie
    Shi, Yong
    [J]. PATTERN RECOGNITION, 2013, 46 (01) : 305 - 316
  • [5] Capped Linex Metric Twin Support Vector Machine for Robust Classification
    Wang, Yifan
    Yu, Guolin
    Ma, Jun
    [J]. SENSORS, 2022, 22 (17)
  • [6] Robust weighted linear loss twin multi-class support vector regression for large-scale classification
    Qiang, Wenwen
    Zhang, Jinxin
    Zhen, Ling
    Jing, Ling
    [J]. SIGNAL PROCESSING, 2020, 170
  • [7] Cluster Reduction Support Vector Machine for Large-scale Data Set Classification
    Chen, Guangxi
    Cheng, Yan
    Xu, Jian
    [J]. PACIIA: 2008 PACIFIC-ASIA WORKSHOP ON COMPUTATIONAL INTELLIGENCE AND INDUSTRIAL APPLICATION, VOLS 1-3, PROCEEDINGS, 2008, : 6 - +
  • [8] Robust Parametric Twin Support Vector Machine for Pattern Classification
    Rastogi, Reshma
    Sharma, Sweta
    Chandra, Suresh
    [J]. NEURAL PROCESSING LETTERS, 2018, 47 (01) : 293 - 323
  • [9] Large scale classification with support vector machine algorithms
    Do, Thanh-Nghi
    Fekete, Jean-Daniel
    [J]. ICMLA 2007: SIXTH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS, PROCEEDINGS, 2007, : 7 - 12