A hybrid approach for sparse Least Squares Support Vector Machines

Cited by: 0
Authors
de Carvalho, BPR [1 ]
Lacerda, WS [1 ]
Braga, AP [1 ]
Affiliation
[1] Vetta Labs, Div Res & Dev, BR-30110090 Belo Horizonte, MG, Brazil
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
We present a hybrid strategy for training Least Squares Support Vector Machines (LS-SVMs) that eliminates their greatest drawback compared to the original Support Vector Machines (SVMs): the lack of automatic support vector detection, the so-called loss of sparseness. The main advantage of LS-SVMs is their lower computational complexity compared to SVMs, achieved without loss of solution quality, since both methods are built on the same principles. In this work, we use a sample selection technique called Reduced Remaining Subset (RRS), based on a modified nearest neighbor rule, to choose the samples that best represent each class. The LS-SVM then uses the selected samples as support vectors to find the decision surface between the classes. Experiments are presented comparing the proposed approach with two existing methods that also aim to impose sparseness on LS-SVMs.
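To make the two-stage pipeline in the abstract concrete, the sketch below pairs a nearest-neighbor-based sample selection step with an LS-SVM solve restricted to the retained samples. This is a minimal illustration, not the authors' method: the exact RRS rule is not specified here, so the selection function uses Hart's condensed nearest neighbor rule as a stand-in, and the RBF kernel, gamma, and sigma values are assumptions.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def condensed_selection(X, y):
    # Stand-in for RRS: Hart's condensed nearest neighbor rule.
    # Keeps a subset that still classifies every training point
    # correctly with 1-NN; NOT the paper's exact RRS algorithm.
    keep = [0]
    changed = True
    while changed:
        changed = False
        for i in range(len(X)):
            d = ((X[keep] - X[i]) ** 2).sum(-1)
            if y[keep][np.argmin(d)] != y[i]:
                keep.append(i)
                changed = True
    return np.array(sorted(set(keep)))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    # Solve the standard LS-SVM linear system on +/-1 targets:
    # [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(X)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]  # bias b, coefficients alpha

def lssvm_predict(Xtr, alpha, b, Xte, sigma=1.0):
    return np.sign(rbf_kernel(Xte, Xtr, sigma) @ alpha + b)

# Hybrid pipeline: select representatives, then train only on them.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] + X[:, 1])      # toy labels in {-1, +1}
idx = condensed_selection(X, y)
b, alpha = lssvm_train(X[idx], y[idx])
acc = (lssvm_predict(X[idx], alpha, b, X) == y).mean()
print(f"kept {len(idx)}/{len(X)} samples, training accuracy {acc:.2f}")
```

Because the LS-SVM system is solved only over the retained subset, every retained sample receives a nonzero coefficient and acts as a support vector, which is how a selection-then-train scheme of this kind restores sparseness.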
Pages: 323-328
Number of pages: 6
Related Papers
50 records in total (10 listed below)
  • [1] A Novel Sparse Least Squares Support Vector Machines
    Xia, Xiao-Lei; Jiao, Weidong; Li, Kang; Irwin, George
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2013, 2013
  • [2] Active Learning for Sparse Least Squares Support Vector Machines
    Zou, Junjie; Yu, Zhengtao; Zong, Huanyun; Zhao, Xing
    ARTIFICIAL INTELLIGENCE AND COMPUTATIONAL INTELLIGENCE, PT II, 2011, 7003: 672+
  • [3] Sparse approximation using least squares support vector machines
    Suykens, JAK; Lukas, L; Vandewalle, J
    ISCAS 2000: IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS - PROCEEDINGS, VOL II: EMERGING TECHNOLOGIES FOR THE 21ST CENTURY, 2000: 757-760
  • [4] Efficient Sparse Least Squares Support Vector Machines for Regression
    Si Gangquan; Shi Jianquan; Guo Zhang; Zhao Weili
    2014 33RD CHINESE CONTROL CONFERENCE (CCC), 2014: 5173-5178
  • [5] Improved sparse least-squares support vector machines
    Cawley, GC; Talbot, NLC
    NEUROCOMPUTING, 2002, 48: 1025-1031
  • [6] Training sparse least squares support vector machines by the QR decomposition
    Xia, Xiao-Lei
    NEURAL NETWORKS, 2018, 106: 175-184
  • [7] A comparison of pruning algorithms for sparse least squares support vector machines
    Hoegaerts, L; Suykens, JAK; Vandewalle, J; De Moor, B
    NEURAL INFORMATION PROCESSING, 2004, 3316: 1247-1253
  • [8] Fast Sparse Least Squares Support Vector Machines by Block Addition
    Ebuchi, Fumito; Kitamura, Takuya
    ADVANCES IN NEURAL NETWORKS, PT I, 2017, 10261: 60-70
  • [9] Efficient sparse least squares support vector machines for pattern classification
    Tian, Yingjie; Ju, Xuchan; Qi, Zhiquan; Shi, Yong
    COMPUTERS & MATHEMATICS WITH APPLICATIONS, 2013, 66 (10): 1935-1947
  • [10] Weighted least squares support vector machines: robustness and sparse approximation
    Suykens, JAK; De Brabanter, J; Lukas, L; Vandewalle, J
    NEUROCOMPUTING, 2002, 48: 85-105