Sparse least squares support vector training in the reduced empirical feature space

Cited by: 0
Author
Shigeo Abe
Affiliation
[1] Kobe University,Graduate School of Science and Technology
Keywords
Cholesky factorization; Empirical feature space; Least squares support vector machines; Multi-class support vector machines; Pattern classification; RBF kernels; Support vector machines;
DOI: not available
Abstract
In this paper we discuss sparse least squares support vector machines (sparse LS SVMs) trained in the empirical feature space, which is spanned by the mapped training data. First, we show that the kernel associated with the empirical feature space gives the same value as the kernel associated with the feature space if one of the arguments of the kernels is mapped into the empirical feature space by the mapping function associated with the feature space. Using this fact, we show that training and testing of kernel-based methods can be done in the empirical feature space and that training of LS SVMs in the empirical feature space reduces to solving a set of linear equations. We then derive the sparse LS SVMs by restricting the solution to the linearly independent training data, selected in the empirical feature space by Cholesky factorization. Support vectors correspond to the selected training data, and they do not change even if the value of the margin parameter is changed. Thus, for linear kernels, the number of support vectors is at most the number of input variables. By computer experiments we show that we can reduce the number of support vectors without deteriorating the generalization ability.
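The two computational ideas in the abstract, selecting linearly independent training data by Cholesky factorization and then training the LS SVM by solving a linear system, can be illustrated with a short sketch. This is a minimal illustration, not the paper's exact formulation: the pivoted incomplete Cholesky routine, the tolerance `tol`, the ridge-style linear system with margin parameter `C`, and all function names below are assumptions chosen for the example.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    """RBF Gram matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def select_independent(K, tol=1e-8):
    """Pivoted (incomplete) Cholesky factorization of the kernel matrix.
    Returns indices of training samples whose images in the empirical
    feature space are numerically linearly independent; in the sparse
    LS SVM these selected samples become the support vectors."""
    n = K.shape[0]
    diag = K.diagonal().astype(float).copy()  # residual diagonal
    L = np.zeros((n, n))
    idx = []
    for k in range(n):
        j = int(np.argmax(diag))
        if diag[j] <= tol:        # remaining columns are (nearly) dependent
            break
        L[:, k] = (K[:, j] - L[:, :k] @ L[j, :k]) / np.sqrt(diag[j])
        diag -= L[:, k] ** 2
        diag = np.maximum(diag, 0.0)  # guard against round-off
        idx.append(j)
    return idx

def train_sparse_ls_svm(X, y, gamma=0.5, C=100.0, tol=1e-8):
    """Map every sample to its kernel values against the selected support
    vectors (a reduced empirical kernel map) and fit the LS SVM by solving
    one regularized linear system -- no iterative optimization."""
    K = rbf_kernel(X, X, gamma)
    idx = select_independent(K, tol)
    H = np.hstack([K[:, idx], np.ones((len(X), 1))])  # last column: bias
    reg = np.eye(H.shape[1]) / C
    reg[-1, -1] = 0.0                                 # leave bias unpenalized
    theta = np.linalg.solve(H.T @ H + reg, H.T @ y)
    return idx, theta

def predict(X_train, idx, theta, X, gamma=0.5):
    """Decision values for new samples X."""
    H = np.hstack([rbf_kernel(X, X_train[idx], gamma),
                   np.ones((len(X), 1))])
    return H @ theta
```

Note the property the abstract emphasizes: the support-vector set `idx` is fixed by the Cholesky-based selection alone, so changing the margin parameter `C` re-solves the linear system but does not change which samples are support vectors.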
Pages: 203-214
Page count: 11
Related papers
(50 in total)
  • [41] Noninvasive arteriosclerosis detection by sparse least squares support vector machine
    Wang, Chen
    Liu, Haikuan
    Gan, Liangzhi
    Xia, Haodi
    [J]. 2020 CHINESE AUTOMATION CONGRESS (CAC 2020), 2020, : 2424 - 2429
  • [42] Weighted least squares support vector machines: robustness and sparse approximation
    Suykens, JAK
    De Brabanter, J
    Lukas, L
    Vandewalle, J
    [J]. NEUROCOMPUTING, 2002, 48 : 85 - 105
  • [43] Sparse Least Squares Support Vector Machines via Genetic Algorithms
    Silva, Juliana Peixoto
    da Rocha Neto, Ajalmar R.
    [J]. 2013 1ST BRICS COUNTRIES CONGRESS ON COMPUTATIONAL INTELLIGENCE AND 11TH BRAZILIAN CONGRESS ON COMPUTATIONAL INTELLIGENCE (BRICS-CCI & CBIC), 2013, : 248 - 253
  • [44] Least squares support vector machines and primal space estimation
    Espinoza, M
    Suykens, JAK
    De Moor, B
    [J]. 42ND IEEE CONFERENCE ON DECISION AND CONTROL, VOLS 1-6, PROCEEDINGS, 2003, : 3451 - 3456
  • [45] Reduced least squares support vector based on kernel partial least squares and its application research
    Song Haiying
    Gui Weihua
    Yang Chunhua
    [J]. PROCEEDINGS OF THE 26TH CHINESE CONTROL CONFERENCE, VOL 3, 2007, : 207 - +
  • [46] Feature selection for least squares projection twin support vector machine
    Guo, Jianhui
    Yi, Ping
    Wang, Ruili
    Ye, Qiaolin
    Zhao, Chunxia
    [J]. NEUROCOMPUTING, 2014, 144 : 174 - 183
  • [47] Improved pruning algorithms for sparse least squares support vector regression machine
    Zhao, Yong-Ping
    Sun, Jian-Guo
    [J]. Xitong Gongcheng Lilun yu Shijian/System Engineering Theory and Practice, 2009, 29 (06): : 166 - 171
  • [48] An Improved Active Learning Sparse Least Squares Support Vector Machines for Regression
    Si Gangquan
    Shi Jianquan
    Guo Zhang
    Gao Hong
    [J]. 2015 27TH CHINESE CONTROL AND DECISION CONFERENCE (CCDC), 2015, : 4558 - 4562
  • [49] Fast pruning algorithm for designing sparse least squares support vector machine
    Zhou, Xin-Ran
    Teng, Zhao-Sheng
    Yi, Zhao
    [J]. Dianji yu Kongzhi Xuebao/Electric Machines and Control, 2009, 13 (04): : 626 - 630
  • [50] Fault diagnosis method of deep sparse least squares support vector machine
    Zhang, Rui
    Li, Ke
    Su, Lei
    Li, Wen-Rui
    [J]. Zhendong Gongcheng Xuebao/Journal of Vibration Engineering, 2019, 32 (06): : 1104 - 1113