Feature scaling for kernel Fisher discriminant analysis using leave-one-out cross validation

Cited by: 9
Authors: Bo, LF [1]; Wang, L [1]; Jiao, LC [1]
Affiliation: [1] Xidian Univ, Inst Intelligent Informat Proc, Xian 710071, Peoples R China
DOI: not available
CLC classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Kernel Fisher discriminant analysis (KFD) is a successful approach to classification. It is well known that the key challenge in KFD lies in the selection of free parameters such as kernel parameters and regularization parameters. Here we focus on the feature-scaling kernel, in which each feature is individually associated with a scaling factor. A novel algorithm, named FS-KFD, is developed to tune the scaling factors and regularization parameters for the feature-scaling kernel. The proposed algorithm optimizes a smoothed leave-one-out error via gradient descent and has been demonstrated to be computationally feasible. FS-KFD is motivated by two fundamental facts: the leave-one-out error of KFD can be expressed in closed form, and the step function can be approximated by a sigmoid function. Empirical comparisons on artificial and benchmark data sets suggest that FS-KFD improves on KFD in terms of classification accuracy.
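The two facts underlying FS-KFD can be illustrated with a minimal sketch. The code below is an illustration, not the paper's implementation: it uses a regularized kernel-ridge smoother on ±1 labels as a stand-in for KFD (the two are closely related), computes the leave-one-out predictions in closed form from the hat matrix, smooths the 0/1 step error with a sigmoid, and tunes per-feature scaling factors by gradient descent with finite-difference gradients in place of the paper's analytic ones. The function names and the `lam`/`beta` parameters are assumptions for the sketch.

```python
import numpy as np

def scaled_rbf_kernel(X1, X2, scales):
    # Feature-scaling RBF kernel: each input dimension d has its own
    # scaling factor scales[d], as in the feature-scaling kernel of the paper.
    D = ((X1[:, None, :] - X2[None, :, :]) * scales) ** 2
    return np.exp(-D.sum(axis=2))

def smooth_loo_error(X, y, scales, lam, beta=5.0):
    # Closed-form leave-one-out predictions for a regularized kernel smoother:
    # with hat matrix H = K (K + lam*I)^{-1}, the LOO residual is
    # (y_i - f(x_i)) / (1 - H_ii), hence f_loo_i = y_i - (y_i - f_i)/(1 - H_ii).
    n = len(y)
    K = scaled_rbf_kernel(X, X, scales)
    H = K @ np.linalg.inv(K + lam * np.eye(n))
    f = H @ y
    f_loo = y - (y - f) / (1.0 - np.diag(H))
    # Smooth the step function 1[y_i * f_loo_i < 0] with a sigmoid of slope beta.
    return np.mean(1.0 / (1.0 + np.exp(beta * y * f_loo)))

def tune_scales(X, y, scales, lam, lr=0.5, steps=20, eps=1e-4):
    # Gradient descent on the smooth LOO error w.r.t. the scaling factors.
    # Finite differences are used here for brevity; the paper derives
    # analytic gradients, which is what makes FS-KFD efficient in practice.
    scales = scales.astype(float).copy()
    for _ in range(steps):
        grad = np.zeros_like(scales)
        for d in range(len(scales)):
            e = np.zeros_like(scales)
            e[d] = eps
            grad[d] = (smooth_loo_error(X, y, scales + e, lam)
                       - smooth_loo_error(X, y, scales - e, lam)) / (2 * eps)
        scales -= lr * grad
    return scales
```

Because the smoothed LOO error is differentiable in the scaling factors, irrelevant features can be driven toward zero scale during the descent, which is the feature-selection effect the paper exploits.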
Pages: 961-978 (18 pages)
Related papers (50 total):
  • [1] Efficient leave-one-out cross-validation of kernel Fisher discriminant classifiers
    Cawley, GC
    Talbot, NLC
    [J]. PATTERN RECOGNITION, 2003, 36 (11) : 2585 - 2592
  • [2] The leave-one-out kernel
    Tsuda, K
    Kawanabe, M
    [J]. ARTIFICIAL NEURAL NETWORKS - ICANN 2002, 2002, 2415 : 727 - 732
  • [3] Weighted Leave-One-Out Cross Validation
    Pronzato, Luc
    Rendas, Maria-João
    [J]. SIAM-ASA JOURNAL ON UNCERTAINTY QUANTIFICATION, 2024, 12 (04) : 1213 - 1239
  • [4] A leave-one-out cross validation bound for kernel methods with applications in learning
    Zhang, T
    [J]. COMPUTATIONAL LEARNING THEORY, PROCEEDINGS, 2001, 2111 : 427 - 443
  • [5] Efficient approximate leave-one-out cross-validation for kernel logistic regression
    Cawley, Gavin C.
    Talbot, Nicola L. C.
    [J]. MACHINE LEARNING, 2008, 71 (2-3) : 243 - 264
  • [6] Approximate Leave-One-Out Cross Validation for Regression With Regularizers
    Auddy, Arnab
    Zou, Haolin
    Rad, Kamiar Rahnama
    Maleki, Arian
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 2024, 70 (11) : 8040 - 8071
  • [7] EBM PEARL: LEAVE-ONE-OUT (LOO) CROSS VALIDATION
    Hupert, Jordan
    [J]. JOURNAL OF PEDIATRICS, 2020, 220 : 264 - 264
  • [8] Leave-one-out bounds for kernel methods
    Zhang, T
    [J]. NEURAL COMPUTATION, 2003, 15 (06) : 1397 - 1437
  • [9] Optimizing Sparse Kernel Ridge Regression Hyperparameters Based on Leave-One-Out Cross-Validation
    Karasuyama, Masayuki
    Nakano, Ryohei
    [J]. 2008 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-8, 2008, : 3463 - 3468