Optimizing Sparse Kernel Ridge Regression Hyperparameters Based on Leave-One-Out Cross-Validation

Cited by: 0
Authors
Karasuyama, Masayuki [1 ]
Nakano, Ryohei [1 ]
Affiliations
[1] Nagoya Inst Technol, Dept Comp Sci & Engn, Nagoya, Aichi 4668555, Japan
Keywords
DOI
None
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Kernel Ridge Regression (KRR) is a nonlinear extension of ridge regression. The performance of KRR depends on its hyperparameters, such as the penalty factor C and the RBF kernel parameter σ. We employ a method called MCV-KRR, which optimizes the KRR hyperparameters so that a cross-validation error is minimized. This method is equivalent to a predictive approach to Gaussian Processes. Since the cost of KRR training is O(N^3), where N is the data size, sparse approximations of KRR have recently been studied to reduce this complexity. In this paper, we apply the minimum cross-validation (MCV) approach to such a sparse approximation. Our experiments show that MCV with the sparse approximation of KRR can achieve almost the same generalization performance as MCV-KRR at much lower cost.
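The O(N^3) LOOCV computation mentioned in the abstract can be sketched as follows: KRR is a linear smoother, so its leave-one-out residual has the exact closed form (y_i − ŷ_i)/(1 − H_ii), where H = K(K + λI)^(−1) is the hat matrix. Below is a minimal NumPy sketch, not the paper's implementation; λ here is the ridge regularizer (corresponding to the paper's penalty factor C up to reparameterization), and the simple grid search merely stands in for the gradient-based MCV optimization the paper uses.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    """Gaussian RBF kernel matrix: k(a, b) = exp(-||a - b||^2 / (2 sigma^2))."""
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def krr_loocv_mse(X, y, lam, sigma):
    """Exact leave-one-out MSE of KRR in one O(N^3) solve, via the
    linear-smoother identity  y_i - yhat_{-i} = (y_i - yhat_i) / (1 - H_ii)."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    H = K @ np.linalg.inv(K + lam * np.eye(n))   # hat matrix: yhat = H y
    loo_resid = (y - H @ y) / (1.0 - np.diag(H))
    return float(np.mean(loo_resid**2))

def select_hyperparams(X, y, lams, sigmas):
    """Hypothetical grid search minimizing the LOOCV error over (lam, sigma)."""
    return min(((krr_loocv_mse(X, y, l, s), l, s)
                for l in lams for s in sigmas))[1:]
```

The closed-form residual makes the LOOCV error as cheap as a single training run, which is what makes minimizing it over the hyperparameters practical; the paper's contribution is extending this idea to a sparse KRR approximation so the per-evaluation cost drops below O(N^3).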
Pages: 3463 - 3468
Page count: 6
Related Papers
50 records in total
  • [1] Efficient approximate leave-one-out cross-validation for kernel logistic regression
    Cawley, Gavin C.
    Talbot, Nicola L. C.
    [J]. MACHINE LEARNING, 2008, 71 (2-3) : 243 - 264
  • [3] Efficient approximate k-fold and leave-one-out cross-validation for ridge regression
    Meijer, Rosa J.
    Goeman, Jelle J.
    [J]. BIOMETRICAL JOURNAL, 2013, 55 (02) : 141 - 155
  • [4] Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers
    Cawley, GC
    Talbot, NLC
    [J]. PATTERN RECOGNITION, 2003, 36 (11) : 2585 - 2592
  • [5] Online modeling of kernel extreme learning machine based on fast leave-one-out cross-validation
    Zhang, Y.-T.
    [J]. Shanghai Jiaotong University, 48
  • [6] Leave-one-out cross-validation is risk consistent for lasso
    Darren Homrighausen
    Daniel J. McDonald
    [J]. Machine Learning, 2014, 97 : 65 - 78
  • [8] Bayesian Leave-One-Out Cross-Validation for Large Data
    Magnusson, Mans
    Andersen, Michael Riis
    Jonasson, Johan
    Vehtari, Aki
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [9] Kriging Model Averaging Based on Leave-One-Out Cross-Validation Method
    Feng, Ziheng
    Zong, Xianpeng
    Xie, Tianfa
    Zhang, Xinyu
    [J]. JOURNAL OF SYSTEMS SCIENCE & COMPLEXITY, 2024, 37 (05) : 2132 - 2156