Implementation Issues of Second-Order Cone Programming Approaches for Support Vector Machine Learning Problems

Cited by: 0
Authors
Debnath, Rameswar [1 ]
Muramatsu, Masakazu [2 ]
Takahashi, Haruhisa [3 ]
Affiliations
[1] Khulna Univ, Comp Sci & Engn Discipline, Khulna 9208, Bangladesh
[2] Univ Electrocommun, Dept Comp Sci, Chofu, Tokyo 1828585, Japan
[3] Univ Electrocommun, Dept Informat & Commun Engn, Chofu, Tokyo 1828585, Japan
Keywords
support vector machine; kernel matrix; second-order cone programming; HKM search direction; AHO search direction; INTERIOR-POINT METHODS; CONVERGENCE; ALGORITHMS;
DOI
10.1587/transfun.E92.A.1209
Chinese Library Classification: TP3 [Computing Technology, Computer Technology]
Subject Classification Code: 0812
Abstract
The core of the support vector machine (SVM) problem is a quadratic programming problem with a linear constraint and bounded variables. This problem can be transformed into a second-order cone programming (SOCP) problem. An interior-point method (IPM) for the SOCP problem can be made efficient, in both storage requirements and computational complexity, if the kernel matrix has low rank. If the kernel matrix does not have low rank, it can be approximated by a low-rank positive semi-definite matrix, which is then fed into the optimizer. In this paper we present two SOCP formulations for each of the SVM classification and regression problems. There are several search-direction methods for implementing SOCPs. Our main goal is to find a better search direction for implementing the SOCP formulations of the SVM problems. Two popular search-direction methods, HKM and AHO, are analyzed for the SVM problems and efficiently implemented. The per-iteration computational costs of the HKM and AHO search-direction methods are shown to be the same for the SVM problems; thus, the training time depends on the number of IPM iterations. Our experimental results show that the HKM method converges faster than the AHO method. We also compare our results with the method proposed in Fine and Scheinberg (2001), which also exploits the low rank of the kernel matrix, and with the state-of-the-art SVM optimization packages SVMTorch and SVMlight. The proposed methods are also compared with Joachims' 'Linear SVM' method on the linear kernel.
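The low-rank approximation step described in the abstract can be sketched as follows. This is a minimal NumPy illustration using truncated eigendecomposition to obtain a rank-r positive semi-definite factorization of an RBF kernel matrix; the paper's actual pipeline (and the Fine and Scheinberg method it compares against, which uses incomplete Cholesky) may differ, and the function names here are illustrative only.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def low_rank_psd_approx(K, r):
    # Rank-r PSD approximation K ~ G @ G.T via truncated eigendecomposition.
    # (Hypothetical helper; the paper may use a different factorization.)
    w, V = np.linalg.eigh(K)            # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:r]       # keep the r largest eigenvalues
    w_r = np.clip(w[idx], 0.0, None)    # clip tiny negatives to enforce PSD
    V_r = V[:, idx]
    return V_r * np.sqrt(w_r)           # G is n x r, so K ~ G @ G.T

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
K = rbf_kernel(X, gamma=0.5)
G = low_rank_psd_approx(K, r=10)        # n x r factor fed to the optimizer
err = np.linalg.norm(K - G @ G.T) / np.linalg.norm(K)
```

The n-by-r factor G, rather than the full n-by-n kernel matrix, is what would be handed to the IPM optimizer, reducing both storage and per-iteration cost when r is much smaller than n.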
Pages: 1209-1222 (14 pages)