Exploiting separability in large-scale linear support vector machine training

Cited by: 12
Authors
Woodsend, Kristian [1 ,2 ]
Gondzio, Jacek [1 ,2 ]
Affiliations
[1] Univ Edinburgh, Sch Math, Edinburgh EH9 3JZ, Midlothian, Scotland
[2] Univ Edinburgh, Maxwell Inst Math Sci, Edinburgh EH9 3JZ, Midlothian, Scotland
Keywords
Support vector machines; Interior point method; Separable quadratic program; Large scale; Interior-point methods; Software
DOI
10.1007/s10589-009-9296-8
Chinese Library Classification
C93 [Management]; O22 [Operations Research]
Subject Classification Codes
070105; 12; 1201; 1202; 120202
Abstract
Linear support vector machine training can be represented as a large quadratic program. We present an efficient and numerically stable algorithm for this problem using interior point methods, requiring only O(n) operations per iteration. By exploiting the separability of the Hessian, we provide a unified approach, from an optimization perspective, to 1-norm classification, 2-norm classification, universum classification, ordinal regression and epsilon-insensitive regression. Our approach has the added advantage of obtaining the hyperplane weights and bias directly from the solver. Numerical experiments indicate that, in contrast to existing methods, the algorithm is largely unaffected by noisy data, and that the training times of our implementation are consistent and highly competitive. We also discuss the effect of using multiple correctors, and of monitoring the angle of the normal to the hyperplane as a termination criterion.
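The following is a rough sketch of the separability the abstract refers to, under assumed notation (X for the n-by-m data matrix, Y = diag(y) for the ±1 labels, z for the dual variables, e for the vector of ones); the paper's exact formulation may differ in detail. The dual of the 1-norm (hinge-loss) linear SVM has a dense n-by-n Hessian, but introducing the auxiliary variable w = X^T Y z gives an equivalent QP whose Hessian is diagonal:

\[
\begin{aligned}
\text{dense dual:}\quad & \min_{z}\; \tfrac{1}{2}\, z^{\top} Y X X^{\top} Y z \;-\; e^{\top} z
  \quad \text{s.t. } y^{\top} z = 0,\; 0 \le z \le C e,\\
\text{separable form:}\quad & \min_{w,\,z}\; \tfrac{1}{2}\, w^{\top} w \;-\; e^{\top} z
  \quad \text{s.t. } w - X^{\top} Y z = 0,\; y^{\top} z = 0,\; 0 \le z \le C e.
\end{aligned}
\]

In the second problem the Hessian is \(\mathrm{diag}(I_m, 0)\), so the linear algebra of each interior point iteration reduces to systems whose size is governed by the m linking constraints rather than by the dense n-by-n matrix, which is what yields the O(n) cost per iteration (with the feature dimension m treated as fixed).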
Pages: 241-269
Number of pages: 29
Related Papers
(50 records in total)
  • [1] Exploiting separability in large-scale linear support vector machine training
    Kristian Woodsend
    Jacek Gondzio
    [J]. Computational Optimization and Applications, 2011, 49 : 241 - 269
  • [2] Large-scale linear nonparallel support vector machine solver
    Tian, Yingjie
    Ping, Yuan
    [J]. NEURAL NETWORKS, 2014, 50 : 166 - 174
  • [3] Large-scale linear nonparallel support vector machine solver
    Tian, Yingjie
    Zhang, Qin
    Ping, Yuan
    [J]. NEUROCOMPUTING, 2014, 138 : 114 - 119
  • [4] Memory-efficient Large-scale Linear Support Vector Machine
    Alrajeh, Abdullah
    Takeda, Akiko
    Niranjan, Mahesan
    [J]. SEVENTH INTERNATIONAL CONFERENCE ON MACHINE VISION (ICMV 2014), 2015, 9445
  • [5] Large-scale Linear Support Vector Regression
    Ho, Chia-Hua
    Lin, Chih-Jen
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2012, 13 : 3323 - 3348
  • [6] Weighted linear loss twin support vector machine for large-scale classification
    Shao, Yuan-Hai
    Chen, Wei-Jie
    Wang, Zhen
    Li, Chun-Na
    Deng, Nai-Yang
    [J]. KNOWLEDGE-BASED SYSTEMS, 2015, 73 : 276 - 288
  • [7] LINEX Support Vector Machine for Large-Scale Classification
    Ma, Yue
    Zhang, Qin
    Li, Dewei
    Tian, Yingjie
    [J]. IEEE ACCESS, 2019, 7 : 70319 - 70331
  • [8] A CASCADING INCREMENTAL TRAINING APPROACH FOR LARGE-SCALE DISTRIBUTED DATA BASED ON SUPPORT VECTOR MACHINE
    Xu Yuanyuan
    Li Shucheng
    Li Fan
    Gu Xiaofeng
    Sun Rui
    [J]. 2020 17TH INTERNATIONAL COMPUTER CONFERENCE ON WAVELET ACTIVE MEDIA TECHNOLOGY AND INFORMATION PROCESSING (ICCWAMTIP), 2020, : 130 - 133
  • [9] Large-scale Multiclass Support Vector Machine Training via Euclidean Projection onto the Simplex
    Blondel, Mathieu
    Fujino, Akinori
    Ueda, Naonori
    [J]. 2014 22ND INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2014, : 1289 - 1294
  • [10] Least Square Support Vector Machine for Large-scale Dataset
    Khanh Nguyen
    Trung Le
    Vinh Lai
    Duy Nguyen
    Dat Tran
    Ma, Wanli
    [J]. 2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015,