Lagrangian support vector machines

Cited: 411
Authors:
Mangasarian, OL
Musicant, DR
Affiliations:
[1] Univ Wisconsin, Dept Comp Sci, Madison, WI 53706 USA
[2] Carleton Coll, Dept Math & Comp Sci, Northfield, MN 55057 USA
Keywords:
DOI:
10.1162/15324430152748218
Chinese Library Classification (CLC):
TP [Automation technology, computer technology];
Subject Classification Code:
0812;
Abstract
An implicit Lagrangian for the dual of a simple reformulation of the standard quadratic program of a linear support vector machine is proposed. This leads to the minimization of an unconstrained differentiable convex function in a space of dimensionality equal to the number of classified points. This problem is solvable by an extremely simple linearly convergent Lagrangian support vector machine (LSVM) algorithm. LSVM requires the inversion at the outset of a single matrix of the order of the much smaller dimensionality of the original input space plus one. The full algorithm is given in this paper in 11 lines of MATLAB code without any special optimization tools such as linear or quadratic programming solvers. This LSVM code can be used "as is" to solve classification problems with millions of points. For example, 2 million points in 10-dimensional input space were classified by a linear surface in 82 minutes on a Pentium III 500 MHz notebook with 384 megabytes of memory (and additional swap space), and in 7 minutes on a 250 MHz UltraSPARC II processor with 2 gigabytes of memory. Other standard classification test problems were also solved. Nonlinear kernel classification can also be solved by LSVM. Although it does not scale up to very large problems, it can handle any positive semidefinite kernel and is guaranteed to converge. A short MATLAB code is also given for nonlinear kernels and tested on a number of problems.
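To make the abstract's description concrete, the sketch below gives a minimal NumPy version of a linear-kernel LSVM-style iteration: it inverts only an (n+1) x (n+1) matrix via the Sherman-Morrison-Woodbury identity, as the abstract's "input-space dimensionality plus one" claim suggests, and then repeats a simple projection-like update of the multipliers until convergence. This is an illustration under my reading of the method, not the paper's 11-line MATLAB code; the function name lsvm_linear and the parameters nu, alpha, itmax, tol and their defaults are assumptions.

import numpy as np

def lsvm_linear(A, d, nu=1.0, alpha=None, itmax=100, tol=1e-5):
    """Illustrative linear-kernel LSVM-style iteration (not the authors' code).

    A  : (m, n) array of training points.
    d  : (m,) array of labels in {+1, -1}.
    nu : weight on the error term (assumed parameter name and default).
    Returns (w, gamma) defining the separating plane  x'w = gamma.
    """
    m, n = A.shape
    e = np.ones(m)
    # H = D [A  -e] with D = diag(d);  Q = I/nu + H H'
    H = d[:, None] * np.hstack([A, -e[:, None]])
    if alpha is None:
        alpha = 1.9 / nu                      # step parameter, 0 < alpha < 2/nu
    # Sherman-Morrison-Woodbury: only an (n+1) x (n+1) matrix is inverted.
    S = H @ np.linalg.inv(np.eye(n + 1) / nu + H.T @ H)

    def apply_Qinv(x):
        return nu * (x - S @ (H.T @ x))       # Q^{-1} x without forming the m x m matrix Q

    u = apply_Qinv(e)                         # starting multiplier vector
    for _ in range(itmax):
        Qu = u / nu + H @ (H.T @ u)           # Q u, again without forming Q
        u_next = apply_Qinv(e + np.maximum(Qu - e - alpha * u, 0.0))
        done = np.linalg.norm(u_next - u) < tol
        u = u_next
        if done:
            break
    w = A.T @ (d * u)                         # plane normal
    gamma = -e @ (d * u)                      # plane offset
    return w, gamma

Training then amounts to w, gamma = lsvm_linear(A, d), and a new point x is classified by the sign of x @ w - gamma. The nonlinear-kernel variant mentioned in the abstract replaces the linear inner products with a positive semidefinite kernel matrix and, as the abstract notes, does not scale to the same problem sizes.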
Pages: 161-177
Page count: 17
Related Papers
50 in total
  • [21] Support vector machines and regularization
    Cherkassky, V
    Ma, YQ
    Seventh IASTED International Conference on Signal and Image Processing, 2005, : 166 - 171
  • [22] Ellipsoidal Support Vector Machines
    Momma, Michinari
    Hatano, Kohei
    Nakayama, Hiroki
    PROCEEDINGS OF 2ND ASIAN CONFERENCE ON MACHINE LEARNING (ACML2010), 2010, 13 : 31 - 46
  • [23] Nested support vector machines
    Lee, Gyemin
    Scott, Clayton
    2008 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING, VOLS 1-12, 2008, : 1985 - 1988
  • [24] On Lagrangian twin support vector regression
    Balasundaram, S.
    Tanveer, M.
    NEURAL COMPUTING & APPLICATIONS, 2013, 22 : S257 - S267
  • [25] Minimax support vector machines
    Davenport, Mark A.
    Baraniuk, Richard G.
    Scott, Clayton D.
    2007 IEEE/SP 14TH WORKSHOP ON STATISTICAL SIGNAL PROCESSING, VOLS 1 AND 2, 2007, : 630 - +
  • [26] Nested Support Vector Machines
    Lee, Gyemin
    Scott, Clayton
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2010, 58 (03) : 1648 - 1660
  • [27] Sex with Support Vector Machines
    Moghaddam, B
    Yang, MH
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 13, 2001, 13 : 960 - 966
  • [28] Fβ support vector machines
    Callut, K
    Dupont, P
    PROCEEDINGS OF THE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), VOLS 1-5, 2005, : 1443 - 1448
  • [29] Oblique support vector machines
    Yao, CC
    Yu, PT
    PROCEEDINGS OF THE 2004 INTERNATIONAL SYMPOSIUM ON INTELLIGENT MULTIMEDIA, VIDEO AND SPEECH PROCESSING, 2004, : 699 - 702
  • [30] Selective support vector machines
    Onur Seref
    O. Erhun Kundakcioglu
    Oleg A. Prokopyev
    Panos M. Pardalos
    Journal of Combinatorial Optimization, 2009, 17 : 3 - 20