Efficient approximate leave-one-out cross-validation for kernel logistic regression

Cited: 0
Authors
Gavin C. Cawley
Nicola L. C. Talbot
Affiliations
[1] University of East Anglia, School of Computing Sciences
Source
Machine Learning | 2008 / Vol. 71
Keywords
Model selection; Kernel logistic regression
DOI
Not available
Abstract
Kernel logistic regression (KLR) is the kernel learning method best suited to binary pattern recognition problems in which estimates of the a posteriori probability of class membership are required. Such problems occur frequently in practical applications, for instance because the operational prior class probabilities (or, equivalently, the relative misclassification costs) are variable or unknown at the time the model is trained. The model parameters are given by the solution of a convex optimization problem, which may be found via an efficient iteratively re-weighted least squares (IRWLS) procedure. The generalization properties of a kernel logistic regression machine are, however, governed by a small number of hyper-parameters, the values of which must be determined during model selection. In this paper, we propose a novel model selection strategy for KLR, based on a computationally efficient closed-form approximation of the leave-one-out cross-validation procedure. Results obtained on a variety of synthetic and real-world benchmark datasets demonstrate that the proposed model selection procedure is competitive with a more conventional k-fold cross-validation based approach, and also with Gaussian process (GP) classifiers implemented using the Laplace approximation and the expectation propagation (EP) algorithm.
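The abstract's central computational idea can be illustrated compactly: at convergence, each IRWLS step for KLR is a weighted kernel ridge regression on a set of working responses, so the standard hat-matrix identity for linear smoothers yields approximate leave-one-out predictors in closed form, without refitting the model n times. The following NumPy sketch is illustrative only and rests on simplifying assumptions (no bias term, dense matrix solves rather than the incremental Cholesky updates the paper employs for efficiency); the function names, the RBF kernel choice, and the fixed iteration count are hypothetical, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    # Gaussian RBF kernel matrix; gamma is an (assumed) width hyper-parameter.
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def fit_klr_irwls(K, y, lam, n_iter=25):
    # Fit kernel logistic regression by iteratively re-weighted least squares.
    # K: (n, n) kernel matrix, y: labels in {0, 1}, lam: regularization parameter.
    n = len(y)
    alpha = np.zeros(n)
    eta = K @ alpha
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-eta))
        w = np.clip(p * (1.0 - p), 1e-8, None)   # IRWLS weights
        z = eta + (y - p) / w                    # working responses
        # Each step solves the weighted ridge system (K + lam * W^{-1}) alpha = z.
        alpha = np.linalg.solve(K + lam * np.diag(1.0 / w), z)
        eta = K @ alpha
    # Recompute weights and working responses at the converged solution.
    p = 1.0 / (1.0 + np.exp(-eta))
    w = np.clip(p * (1.0 - p), 1e-8, None)
    z = eta + (y - p) / w
    return alpha, eta, w, z

def approx_loo_nll(K, y, lam):
    # Closed-form approximation of the leave-one-out negative log-likelihood:
    # treat the final IRWLS step as a linear smoother eta = H z and apply the
    # usual PRESS-style identity to obtain held-out predictors.
    _, eta, w, z = fit_klr_irwls(K, y, lam)
    H = K @ np.linalg.solve(K + lam * np.diag(1.0 / w), np.eye(len(y)))
    h = np.diag(H)                                # leverage values
    eta_loo = z - (z - eta) / (1.0 - h)           # approximate LOO predictors
    p_loo = np.clip(1.0 / (1.0 + np.exp(-eta_loo)), 1e-12, 1.0 - 1e-12)
    return -np.mean(y * np.log(p_loo) + (1 - y) * np.log(1 - p_loo))
```

Model selection then reduces to minimizing approx_loo_nll over a small grid of candidate (lam, gamma) pairs, computing K = rbf_kernel(X, X, gamma) for each; every grid point costs a single fit rather than n refits.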
Pages: 243-264
Page count: 21
Related Papers
50 items in total
  • [1] Efficient approximate leave-one-out cross-validation for kernel logistic regression
    Cawley, Gavin C.
    Talbot, Nicola L. C.
    Machine Learning, 2008, 71(2-3): 243-264
  • [2] Efficient approximate k-fold and leave-one-out cross-validation for ridge regression
    Meijer, Rosa J.
    Goeman, Jelle J.
    Biometrical Journal, 2013, 55(2): 141-155
  • [3] Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers
    Cawley, G. C.
    Talbot, N. L. C.
    Pattern Recognition, 2003, 36(11): 2585-2592
  • [4] Approximate Leave-One-Out Cross Validation for Regression With Regularizers
    Auddy, Arnab
    Zou, Haolin
    Rad, Kamiar Rahnama
    Maleki, Arian
    IEEE Transactions on Information Theory, 2024, 70(11): 8040-8071
  • [5] Optimizing Sparse Kernel Ridge Regression Hyperparameters Based on Leave-One-Out Cross-Validation
    Karasuyama, Masayuki
    Nakano, Ryohei
    2008 IEEE International Joint Conference on Neural Networks, Vols 1-8, 2008: 3463-3468
  • [6] Dichotomous logistic regression with leave-one-out validation
    Teh, Sin Yin
    Othman, Abdul Rahman
    Khoo, Michael Boon Chong
    World Academy of Science, Engineering and Technology, 2010, 62: 1001-1010
  • [7] Leave-one-out cross-validation is risk consistent for lasso
    Homrighausen, Darren
    McDonald, Daniel J.
    Machine Learning, 2014, 97(1-2): 65-78
  • [8] Bayesian Leave-One-Out Cross-Validation for Large Data
    Magnusson, Mans
    Andersen, Michael Riis
    Jonasson, Johan
    Vehtari, Aki
    International Conference on Machine Learning, Vol 97, 2019
  • [9] Approximate Leave-one-out Cross Validation for Regression with l1 Regularizers
    Auddy, Arnab
    Zou, Haolin
    Rad, Kamiar Rahnama
    Maleki, Arian
    International Conference on Artificial Intelligence and Statistics, Vol 238, 2024