Training soft margin support vector machines by simulated annealing: A dual approach

Cited by: 20
Authors
Dantas Dias, Madson L. [1 ]
Rocha Neto, Ajalmar R. [1 ]
Affiliation
[1] Fed Inst Ceara IFCE, Dept Teleinformat, Av Treze Maio 2081, BR-60040215 Fortaleza, Ceara, Brazil
Keywords
Support vector machines; Simulated annealing; Learning methods
DOI
10.1016/j.eswa.2017.06.016
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
A theoretical advantage of support vector machines (SVMs) is the combination of empirical and structural risk minimization, which balances the complexity of the model against its success at fitting the training data. Metaheuristics have mostly been used with support vector machines either to tune hyperparameters or to perform feature selection. In this paper, we present a new approach to obtain sparse support vector machines based on simulated annealing (SA), named SATE. In our proposal, SA is used to solve the quadratic optimization problem that emerges from support vector machines, rather than to tune the hyperparameters. We compared our proposal with sequential minimal optimization (SMO), the kernel adatron (KA), a standard QP solver, as well as with recent particle swarm optimization (PSO)- and genetic algorithm (GA)-based versions. Generally speaking, one can infer that SATE is equivalent to SMO in terms of accuracy and mean number of support vectors, and sparser than KA, QP, LPSO, and GA. SATE also achieves higher accuracies than the GA- and PSO-based versions. Moreover, SATE successfully embeds the SVM constraints and provides a competitive classifier while maintaining its simplicity and high sparseness in the solution. (C) 2017 Elsevier Ltd. All rights reserved.
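To make the idea in the abstract concrete, the sketch below applies simulated annealing directly to the soft-margin SVM dual, maximizing W(alpha) = sum_i alpha_i - (1/2) sum_{i,j} alpha_i alpha_j y_i y_j K(x_i, x_j) subject to 0 <= alpha_i <= C and sum_i alpha_i y_i = 0. This is a minimal illustration of the general technique under stated assumptions, not the authors' SATE algorithm: the linear kernel, the SMO-style paired move used to keep the equality constraint satisfied, and the geometric cooling schedule are all illustrative choices.

```python
import numpy as np

def dual_objective(alpha, y, K):
    """Soft-margin SVM dual: W(a) = sum(a) - 0.5 * a^T (y y^T * K) a."""
    return alpha.sum() - 0.5 * alpha @ (np.outer(y, y) * K) @ alpha

def sa_svm_dual(X, y, C=1.0, T0=1.0, cooling=0.9, steps_per_T=200,
                T_min=1e-4, seed=0):
    """Maximize the SVM dual by simulated annealing (linear kernel).

    Each move perturbs a pair (alpha_i, alpha_j) along directions that
    keep sum(alpha * y) = 0 exactly, with the step size drawn from the
    interval that keeps both multipliers inside the box [0, C], so every
    state visited is feasible.
    """
    rng = np.random.default_rng(seed)
    n = len(y)
    K = X @ X.T                       # linear-kernel Gram matrix
    alpha = np.zeros(n)               # alpha = 0 satisfies both constraints
    obj = dual_objective(alpha, y, K)
    best_alpha, best_obj = alpha.copy(), obj
    T = T0
    while T > T_min:
        for _ in range(steps_per_T):
            i, j = rng.choice(n, size=2, replace=False)
            # Feasible range for delta so that alpha_i + y_i*delta and
            # alpha_j - y_j*delta both stay in [0, C].
            lo_i, hi_i = ((-alpha[i], C - alpha[i]) if y[i] > 0
                          else (alpha[i] - C, alpha[i]))
            lo_j, hi_j = ((alpha[j] - C, alpha[j]) if y[j] > 0
                          else (-alpha[j], C - alpha[j]))
            lo, hi = max(lo_i, lo_j), min(hi_i, hi_j)
            if lo >= hi:
                continue
            delta = rng.uniform(lo, hi)
            cand = alpha.copy()
            cand[i] += y[i] * delta
            cand[j] -= y[j] * delta
            # O(n^2) re-evaluation; a real solver would update the
            # objective incrementally in O(n) since only two coordinates
            # changed.
            cand_obj = dual_objective(cand, y, K)
            # Metropolis acceptance rule for a maximization problem.
            if cand_obj > obj or rng.random() < np.exp((cand_obj - obj) / T):
                alpha, obj = cand, cand_obj
                if obj > best_obj:
                    best_alpha, best_obj = alpha.copy(), obj
        T *= cooling                  # geometric cooling schedule
    return best_alpha

if __name__ == "__main__":
    # Tiny synthetic demo: two Gaussian blobs, one per class.
    gen = np.random.default_rng(1)
    X = np.vstack([gen.normal(-1, 0.5, (20, 2)), gen.normal(1, 0.5, (20, 2))])
    y = np.r_[-np.ones(20), np.ones(20)]
    alpha = sa_svm_dual(X, y, C=1.0)
    print("support vectors:", int(np.sum(alpha > 1e-6)))
```

After annealing, the support vectors are the points with nonzero multipliers (alpha_i > 0); the sparseness the abstract reports corresponds to how few of these remain in the final solution.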
Pages: 157-169
Number of pages: 13
Related papers
50 records in total
  • [41] Quantum optimization for training support vector machines
    Anguita, D
    Ridella, S
    Rivieccio, F
    Zunino, R
    NEURAL NETWORKS, 2003, 16 (5-6) : 763 - 770
  • [42] An explicit algorithm for training support vector machines
    Mattera, D
    Palmieri, F
    Haykin, S
    IEEE SIGNAL PROCESSING LETTERS, 1999, 6 (09) : 243 - 245
  • [43] An improved training algorithm for support vector machines
    Osuna, E
    Freund, R
    Girosi, F
    NEURAL NETWORKS FOR SIGNAL PROCESSING VII, 1997, : 276 - 285
  • [44] Deep Features for Training Support Vector Machines
    Nanni, Loris
    Ghidoni, Stefano
    Brahnam, Sheryl
    JOURNAL OF IMAGING, 2021, 7 (09)
  • [45] Fast training of support vector machines for regression
    Anguita, D
    Boni, A
    Pace, S
    IJCNN 2000: PROCEEDINGS OF THE IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOL V, 2000, : 210 - 214
  • [46] Parameter determination of support vector machine and feature selection using simulated annealing approach
    Lin, Shih-Wei
    Lee, Zne-Jung
    Chen, Shih-Chieh
    Tseng, Tsung-Yuan
    APPLIED SOFT COMPUTING, 2008, 8 (04) : 1505 - 1512
  • [47] F-SVC: A SIMPLE AND FAST TRAINING ALGORITHM FOR SOFT MARGIN SUPPORT VECTOR CLASSIFICATION
    Tohme, Mireille
    Lengelle, Regis
    2008 IEEE WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING, 2008, : 339+
  • [48] A new training method for support vector machines: Clustering k-NN support vector machines
    Comak, Emre
    Arslan, Ahmet
    EXPERT SYSTEMS WITH APPLICATIONS, 2008, 35 (03) : 564 - 568
  • [49] Effective training of support vector machines using extractive support vector algorithm
    Yao, Chih-Chia
    Yu, Pao-Ta
    PROCEEDINGS OF 2007 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7, 2007, : 1808+
  • [50] Support Vector Machines with the Ramp Loss and the Hard Margin Loss
    Brooks, J. Paul
    OPERATIONS RESEARCH, 2011, 59 (02) : 467 - 479