Optimizing Sparse Kernel Ridge Regression Hyperparameters Based on Leave-One-Out Cross-Validation

Cited by: 0
Authors: Karasuyama, Masayuki [1]; Nakano, Ryohei [1]
Institutions: [1] Nagoya Inst Technol, Dept Comp Sci & Engn, Nagoya, Aichi 4668555, Japan
Keywords: (none listed)
DOI: not available
CLC classification: TP18 [Artificial Intelligence Theory]
Subject classification codes: 081104; 0812; 0835; 1405
Abstract
Kernel Ridge Regression (KRR) is a nonlinear extension of ridge regression. The performance of KRR depends on its hyperparameters, such as the penalty factor C and the RBF kernel parameter σ. We employ a method called MCV-KRR, which optimizes the KRR hyperparameters so that a cross-validation error is minimized. This method becomes equivalent to a predictive approach to Gaussian Processes. Since the cost of KRR training is O(N³), where N is the data size, sparse approximations of KRR have recently been studied to reduce this complexity. In this paper, we apply the minimum cross-validation (MCV) approach to such sparse approximations. Our experiments show that MCV with a sparse approximation of KRR can achieve almost the same generalization performance as MCV-KRR at much lower cost.
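The criterion being minimized can be sketched in closed form: for KRR, the leave-one-out residuals are obtained exactly from the hat matrix, so the LOO error costs one O(N³) fit rather than N refits. This is an illustrative sketch under assumed notation (C as the ridge penalty added to the kernel matrix, σ as the RBF width), not the authors' implementation, which follows the gradient of this criterion.

```python
import numpy as np

def rbf_kernel(X, sigma):
    """RBF (Gaussian) kernel matrix for the rows of X."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def loo_error_krr(X, y, C, sigma):
    """Mean squared leave-one-out error of KRR, in closed form.

    Uses the hat-matrix identity e_i^loo = (y_i - yhat_i) / (1 - H_ii),
    which is exact for kernel ridge regression.  The matrix inverse is
    the O(N^3) bottleneck that the sparse approximations target.
    """
    N = len(y)
    K = rbf_kernel(X, sigma)
    H = K @ np.linalg.inv(K + C * np.eye(N))   # hat matrix: yhat = H @ y
    loo_resid = (y - H @ y) / (1.0 - np.diag(H))
    return float(np.mean(loo_resid**2))
```

A hyperparameter search can then simply minimize `loo_error_krr` over a grid of (C, σ) values; the MCV approach instead descends the gradient of this criterion with respect to the hyperparameters.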
Pages: 3463-3468 (6 pages)