Weighted least squares support vector machines: robustness and sparse approximation

Cited by: 1022
Authors
Suykens, JAK [1 ]
De Brabanter, J [1 ]
Lukas, L [1 ]
Vandewalle, J [1 ]
Affiliation
[1] Katholieke Univ Leuven, Dept Elect Engn, ESAT SISTA, B-3001 Louvain, Heverlee, Belgium
Keywords
support vector machines; (weighted) least squares; ridge regression; sparse approximation; robust estimation;
DOI
10.1016/S0925-2312(01)00644-0
CLC classification number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
Least squares support vector machines (LS-SVMs) are an SVM variant that involves equality instead of inequality constraints and works with a least squares cost function. In this way, the solution follows from a linear Karush-Kuhn-Tucker system instead of a quadratic programming problem. However, sparseness is lost in the LS-SVM case, and the estimation of the support values is only optimal when the error variables are Gaussian distributed. In this paper, we discuss a method which can overcome these two drawbacks. We show how to obtain robust regression estimates by applying a weighted version of LS-SVM. We also discuss a sparse approximation procedure for weighted and unweighted LS-SVM. It is essentially a pruning method that prunes based on the physical meaning of the sorted support values, whereas pruning procedures for classical multilayer perceptrons require the computation of a Hessian matrix or its inverse. The methods of this paper are illustrated for RBF kernels and demonstrate how to obtain robust estimates, with selection of an appropriate number of hidden units, in the case of outliers or non-Gaussian error distributions with heavy tails. (C) 2002 Elsevier Science B.V. All rights reserved.
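A minimal numerical sketch of the procedure summarized in the abstract, written in Python/NumPy for illustration: the (weighted) LS-SVM regression estimate is obtained by solving a linear Karush-Kuhn-Tucker system, robustness is added by re-weighting with Hampel-type weights computed from the residuals of an unweighted fit, and sparseness by discarding the data points with the smallest sorted support values |alpha_k| before re-estimating. The kernel bandwidth sigma, regularization constant gamma, cutoff constants c1 and c2, the MAD-based scale estimate, and the prune fraction are illustrative assumptions made here, not values taken from the paper; this is a sketch of the idea, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel: K[i, j] = exp(-||x_i - x_j||^2 / sigma^2)
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / sigma ** 2)

def lssvm_fit(X, y, gamma=10.0, sigma=1.0, v=None):
    # Solve the linear KKT system of (weighted) LS-SVM regression:
    #   [ 0   1^T                     ] [ b     ]   [ 0 ]
    #   [ 1   K + diag(1/(gamma*v_k)) ] [ alpha ] = [ y ]
    # With all weights v_k = 1 this is the ordinary (unweighted) LS-SVM.
    n = len(y)
    v = np.ones(n) if v is None else v
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gamma * v))
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(X_new, X, alpha, b, sigma=1.0):
    # y(x) = sum_k alpha_k K(x, x_k) + b
    return rbf_kernel(X_new, X, sigma) @ alpha + b

def hampel_weights(e, c1=2.5, c2=3.0):
    # Robust weights from the residuals e_k = alpha_k / gamma of an
    # unweighted fit; the scale s is a MAD-based robust estimate.
    s = 1.483 * np.median(np.abs(e - np.median(e)))
    r = np.abs(e / s)
    return np.where(r <= c1, 1.0,
           np.where(r <= c2, (c2 - r) / (c2 - c1), 1e-4))

def prune_smallest(X, y, alpha, drop_frac=0.05):
    # Sparse approximation: sort |alpha_k| and drop the least significant
    # fraction of data points, then re-estimate on the reduced set.
    keep = np.argsort(np.abs(alpha))[int(drop_frac * len(y)):]
    return X[keep], y[keep]

# Usage sketch on synthetic 1-D data with heavy-tailed noise:
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 100)[:, None]
y = np.sinc(X[:, 0]) + 0.1 * rng.standard_t(df=2, size=100)

gamma, sigma = 10.0, 1.0
alpha, b = lssvm_fit(X, y, gamma, sigma)          # unweighted fit
v = hampel_weights(alpha / gamma)                 # weights from residuals
alpha_w, b_w = lssvm_fit(X, y, gamma, sigma, v)   # robust weighted refit
Xp, yp = prune_smallest(X, y, alpha_w)            # sparser training set
```

In practice the re-weighting and pruning steps would be repeated, together with validation of gamma, sigma, and the remaining number of support vectors, until a stopping criterion is met; the sketch above performs a single pass only.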
Pages: 85-105
Number of pages: 21
Related Papers
50 records in total
  • [1] Sparse approximation using least squares support vector machines
    Suykens, JAK
    Lukas, L
    Vandewalle, J
    [J]. ISCAS 2000: IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS - PROCEEDINGS, VOL II: EMERGING TECHNOLOGIES FOR THE 21ST CENTURY, 2000: 757-760
  • [2] A Novel Sparse Least Squares Support Vector Machines
    Xia, Xiao-Lei
    Jiao, Weidong
    Li, Kang
    Irwin, George
    [J]. MATHEMATICAL PROBLEMS IN ENGINEERING, 2013, 2013
  • [3] Dynamic weighted least squares support vector machines
    Fan, Yu-Gang
    Li, Ping
    Song, Zhi-Huan
    [J]. Kongzhi yu Juece/Control and Decision, 2006, 21 (10): 1129-1133
  • [4] A hybrid approach for sparse Least Squares Support Vector Machines
    de Carvalho, BPR
    Lacerda, WS
    Braga, AP
    [J]. HIS 2005: 5TH INTERNATIONAL CONFERENCE ON HYBRID INTELLIGENT SYSTEMS, PROCEEDINGS, 2005: 323-328
  • [6] Active Learning for Sparse Least Squares Support Vector Machines
    Zou, Junjie
    Yu, Zhengtao
    Zong, Huanyun
    Zhao, Xing
    [J]. ARTIFICIAL INTELLIGENCE AND COMPUTATIONAL INTELLIGENCE, PT II, 2011, 7003: 672+
  • [7] Improved sparse least-squares support vector machines
    Cawley, GC
    Talbot, NLC
    [J]. NEUROCOMPUTING, 2002, 48: 1025-1031
  • [8] Efficient Sparse Least Squares Support Vector Machines for Regression
    Si Gangquan
    Shi Jianquan
    Guo Zhang
    Zhao Weili
    [J]. 2014 33RD CHINESE CONTROL CONFERENCE (CCC), 2014: 5173-5178
  • [9] A Novel Sparse Weighted Least squares Support Vector Classifier
    Bo, Yang
    Liang, Zhang
    [J]. MATERIALS, MECHANICAL AND MANUFACTURING ENGINEERING, 2014, 842: 746-749
  • [10] Fast sparse approximation for least squares support vector machine
    Jiao, Licheng
    Bo, Liefeng
    Wang, Ling
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2007, 18 (03): 685-697