Fast cross-validation algorithms for least squares support vector machine and kernel ridge regression

Cited by: 258
Authors
An, Senjian [1 ]
Liu, Wanquan [1 ]
Venkatesh, Svetha [1 ]
Affiliations
[1] Curtin Univ Technol, Dept Comp, Perth, WA 6845, Australia
Keywords
model selection; cross-validation; kernel methods
DOI
10.1016/j.patcog.2006.12.015
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Given n training examples, training a least squares support vector machine (LS-SVM) or kernel ridge regression (KRR) amounts to solving a linear system of dimension n. In cross-validating LS-SVM or KRR, the training examples are split into two disjoint subsets l times, where a subset of m examples is used for validation and the remaining (n - m) examples are used for training the classifier; this requires solving l linear systems of dimension (n - m). We propose a novel method for cross-validation (CV) of LS-SVM or KRR in which, instead of solving l linear systems of dimension (n - m), we compute the inverse of an n-dimensional square matrix once and then solve l linear systems of dimension m, thereby reducing the complexity when l is large and/or m is small. Typical multi-fold, leave-one-out (LOO-CV) and leave-many-out cross-validations are considered. For the five-fold CV used in practice, with five repetitions over randomly drawn slices, the proposed algorithm is approximately four times as efficient as the naive implementation. For large data sets, we propose to evaluate the CV approximately by applying the well-known incomplete Cholesky decomposition technique; the complexity of these approximate algorithms scales linearly in the data size when the rank of the associated kernel matrix is much smaller than n. Simulations demonstrate the performance of LS-SVM and the efficiency of the proposed algorithm in comparison with the naive and some existing implementations of multi-fold and LOO-CV. (C) 2007 Pattern Recognition Society. Published by Elsevier Ltd. All rights reserved.
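The speedup the abstract describes can be made concrete. For KRR without a bias term (LS-SVM adds a bias and an equality constraint, which the paper also handles), training on all n points gives alpha = (K + lambda*I)^{-1} y. Writing C = (K + lambda*I)^{-1}, the residuals of the model trained without a validation block S obey the block hold-out identity y_S - f_{-S}(x_S) = C[S,S]^{-1} (C y)_S, so one n-dimensional inversion replaces l trainings of size (n - m), and each fold then costs only an m-dimensional solve. The NumPy sketch below illustrates this identity; the function name, the RBF kernel, and the fold layout are illustrative assumptions, not the authors' code.

import numpy as np

def fast_cv_residuals(K, y, lam, folds):
    """Hold-out residuals for KRR without retraining per fold.

    With C = (K + lam*I)^{-1}, the residuals of the model trained on the
    complement of a validation block S satisfy
        y[S] - f_{-S}(x[S]) = solve(C[S, S], (C @ y)[S]),
    so after one n-dimensional inversion each fold needs only an
    m-dimensional solve, matching the shape of the abstract's algorithm.
    """
    n = K.shape[0]
    C = np.linalg.inv(K + lam * np.eye(n))   # the single n x n inverse
    Cy = C @ y
    res = np.empty_like(y, dtype=float)
    for S in folds:                          # l small systems of dimension m
        res[S] = np.linalg.solve(C[np.ix_(S, S)], Cy[S])
    return res

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 3))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
    # RBF kernel: an illustrative choice, not prescribed by the paper.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-0.5 * sq)
    folds = np.array_split(rng.permutation(200), 5)   # 5-fold CV

    fast = fast_cv_residuals(K, y, 0.1, folds)

    # Naive check: retrain on each complement and predict the held-out block.
    naive = np.empty_like(fast)
    for S in folds:
        T = np.setdiff1d(np.arange(200), S)           # training indices
        alpha = np.linalg.solve(K[np.ix_(T, T)] + 0.1 * np.eye(T.size), y[T])
        naive[S] = y[S] - K[np.ix_(S, T)] @ alpha
    print(np.max(np.abs(fast - naive)))               # agreement to ~1e-12

After the one-time O(n^3) inversion, each of the l folds costs O(m^3) plus the cost of slicing, which is where the advantage over l solves of dimension (n - m) comes from when l is large and/or m is small.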
Pages: 2154-2162
Page count: 9