SMO-based pruning methods for sparse least squares support vector machines

Cited: 75
Authors
Zeng, XY
Chen, XW [1]
Affiliations
[1] Calif State Univ Northridge, Dept Elect & Comp Engn, Northridge, CA 91003 USA
[2] Univ Kansas, Dept Elect Engn & Comp Sci, Lawrence, KS 66045 USA
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2005, Vol. 16, No. 6
Keywords
least squares support vector machine; pruning; sequential minimal optimization (SMO); sparseness;
DOI
10.1109/TNN.2005.852239
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Solutions of least squares support vector machines (LS-SVMs) are typically nonsparse. Sparseness is imposed by subsequently omitting the data points that introduce the smallest training errors and retraining on the remaining data. Such iterative retraining requires more intensive computation than training a single nonsparse LS-SVM. In this paper, we propose a new pruning algorithm for sparse LS-SVMs: the sequential minimal optimization (SMO) method is introduced into the pruning process; in addition, instead of selecting the pruning points by their errors, we omit the data points that introduce the smallest changes to the dual objective function. This new criterion is computationally efficient. The effectiveness of the proposed method in terms of computational cost and classification accuracy is demonstrated by numerical experiments.
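As a rough illustration of the pruning criterion described in the abstract, the sketch below (Python with NumPy; the names rbf_kernel, train_ls_svm, and prune_ls_svm are ours, not the authors') trains an LS-SVM by solving its KKT linear system and then repeatedly drops the training point whose removal changes a dual-objective value the least. The paper computes this change cheaply inside an SMO solver; the brute-force re-solve used here only demonstrates the selection rule, not the efficient implementation, and the SVM-style dual value is a stand-in for the paper's exact objective.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Squared Euclidean distances via broadcasting, then Gaussian kernel.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_ls_svm(X, y, gamma=10.0, sigma=1.0):
    # Solve the LS-SVM KKT linear system
    #   [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1],
    # where Omega_ij = y_i * y_j * K(x_i, x_j) and y holds +/-1 labels.
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    # SVM-style dual objective value, used here only to rank candidate prunings
    # (an assumption standing in for the paper's exact dual objective).
    obj = alpha.sum() - 0.5 * alpha @ (Omega + np.eye(n) / gamma) @ alpha
    return alpha, b, obj

def prune_ls_svm(X, y, keep_ratio=0.5, gamma=10.0, sigma=1.0):
    # Iteratively remove the point whose removal changes the objective least.
    idx = np.arange(len(y))
    alpha, b, obj = train_ls_svm(X[idx], y[idx], gamma, sigma)
    target = max(2, int(keep_ratio * len(y)))
    while len(idx) > target:
        changes = []
        for k in range(len(idx)):
            trial = np.delete(idx, k)
            _, _, obj_k = train_ls_svm(X[trial], y[trial], gamma, sigma)
            changes.append(abs(obj - obj_k))
        idx = np.delete(idx, int(np.argmin(changes)))
        alpha, b, obj = train_ls_svm(X[idx], y[idx], gamma, sigma)
    return idx, alpha, b
```

Calling prune_ls_svm(X, y, keep_ratio=0.3) on a small +/-1-labeled dataset returns the indices of the retained points together with the final alpha and b; the O(n) re-solves per pruning step in this sketch are precisely the cost that the paper's SMO-based updates are designed to avoid.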
Pages: 1541-1546
Page count: 6
Related papers (50 total)
  • [1] A comparison of pruning algorithms for sparse least squares support vector machines
    Hoegaerts, L
    Suykens, JAK
    Vandewalle, J
    De Moor, B
    [J]. NEURAL INFORMATION PROCESSING, 2004, 3316 : 1247 - 1253
  • [2] A Novel Sparse Least Squares Support Vector Machines
    Xia, Xiao-Lei
    Jiao, Weidong
    Li, Kang
    Irwin, George
    [J]. MATHEMATICAL PROBLEMS IN ENGINEERING, 2013, 2013
  • [3] Pruning error minimization in least squares support vector machines
    de Kruif, BJ
    de Vries, TJA
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2003, 14 (03): : 696 - 702
  • [4] Single Directional SMO Algorithm for Least Squares Support Vector Machines
    Shao, Xigao
    Wu, Kun
    Liao, Bifeng
    [J]. COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2013, 2013
  • [5] A hybrid approach for sparse Least Squares Support Vector Machines
    de Carvalho, BPR
    Lacerda, WS
    Braga, AP
    [J]. HIS 2005: 5TH INTERNATIONAL CONFERENCE ON HYBRID INTELLIGENT SYSTEMS, PROCEEDINGS, 2005, : 323 - 328
  • [6] Active Learning for Sparse Least Squares Support Vector Machines
    Zou, Junjie
    Yu, Zhengtao
    Zong, Huanyun
    Zhao, Xing
    [J]. ARTIFICIAL INTELLIGENCE AND COMPUTATIONAL INTELLIGENCE, PT II, 2011, 7003 : 672 - +
  • [7] Improved sparse least-squares support vector machines
    Cawley, GC
    Talbot, NLC
    [J]. NEUROCOMPUTING, 2002, 48 : 1025 - 1031
  • [8] Sparse approximation using least squares support vector machines
    Suykens, JAK
    Lukas, L
    Vandewalle, J
    [J]. ISCAS 2000: IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS - PROCEEDINGS, VOL II: EMERGING TECHNOLOGIES FOR THE 21ST CENTURY, 2000, : 757 - 760
  • [9] Efficient Sparse Least Squares Support Vector Machines for Regression
    Si Gangquan
    Shi Jianquan
    Guo Zhang
    Zhao Weili
    [J]. 2014 33RD CHINESE CONTROL CONFERENCE (CCC), 2014, : 5173 - 5178