Kernel Stability for Model Selection in Kernel-Based Algorithms

Cited by: 2
Authors
Liu, Yong [1 ]
Liao, Shizhong [2 ]
Zhang, Hua [1 ]
Ren, Wenqi [1 ]
Wang, Weiping [1 ]
Affiliations
[1] Chinese Academy of Sciences, Institute of Information Engineering, Beijing 100093, China
[2] Tianjin University, College of Intelligence and Computing, Tianjin 300072, China
Funding
National Natural Science Foundation of China
Keywords
Kernel; stability criteria; eigenvalues and eigenfunctions; perturbation methods; estimation; support vector machines; AutoML; cross-validation (CV); generalization error; kernel methods; kernel selection; model selection; stability; leave-one-out; regularization; matrix; bounds; error
DOI
10.1109/TCYB.2019.2923824
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Model selection is one of the fundamental problems in kernel-based algorithms and is commonly performed by minimizing an estimate of the generalization error. The stability of learning machines and the cross-validation (CV) error are two widely used tools for analyzing generalization performance. However, both tools have disadvantages when applied to model selection: 1) the stability of a learning machine is impractical to use because its specific value is difficult to estimate and 2) the CV-based estimate of the generalization error usually has a relatively high variance, so it is prone to overfitting. To overcome these two limitations, we present a novel notion of kernel stability (KS) for deriving generalization error bounds and variance bounds of CV, and provide an effective approach to applying KS to practical model selection. Unlike existing notions of stability, which are defined on the learning machine, KS is defined on the kernel matrix; hence, it avoids the difficulty of estimating its value. We establish the relationship between KS and the popular uniform stability of learning algorithms, and further propose several KS-based generalization error bounds and variance bounds of CV. By minimizing the proposed bounds, we present two novel KS-based criteria that can ensure good performance. Finally, we empirically analyze the performance of the proposed criteria on several benchmark datasets, demonstrating that our KS-based criteria are sound and effective.
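To make the idea of a kernel-matrix-based selection criterion concrete, the following minimal Python sketch illustrates one plausible reading of the abstract, not the criteria derived in the paper. It assumes KS can be quantified as the spectral-norm change of the Gram matrix under a single-point replacement and combines that quantity additively with a kernel ridge regression training error; the function names (gaussian_kernel_matrix, empirical_kernel_stability, select_gamma), the penalty weight, and the additive form of the score are all illustrative assumptions.

```python
# Illustrative sketch only: the exact definition of kernel stability (KS) and the
# KS-based criteria are given in the paper. Here KS is *assumed* to be the largest
# spectral-norm change of the kernel matrix when one training point is replaced,
# and the selection rule (training error + KS penalty) is a hypothetical stand-in.
import numpy as np

def gaussian_kernel_matrix(X, gamma):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X**2, axis=1)
    sq_dists = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * np.maximum(sq_dists, 0.0))

def empirical_kernel_stability(X, gamma, n_trials=20, rng=None):
    """Largest observed ||K - K_i||_2 over random single-point replacements."""
    rng = np.random.default_rng(rng)
    K = gaussian_kernel_matrix(X, gamma)
    worst = 0.0
    for _ in range(n_trials):
        Xi = X.copy()
        i, j = rng.integers(len(X)), rng.integers(len(X))
        Xi[i] = X[j]                      # replace one training point by another
        Ki = gaussian_kernel_matrix(Xi, gamma)
        worst = max(worst, np.linalg.norm(K - Ki, ord=2))  # spectral norm
    return worst

def select_gamma(X, y, gammas, lam=1.0, penalty_weight=0.1):
    """Pick the kernel width minimizing (KRR training error + KS penalty)."""
    n = len(X)
    best_gamma, best_score = None, np.inf
    for gamma in gammas:
        K = gaussian_kernel_matrix(X, gamma)
        alpha = np.linalg.solve(K + lam * np.eye(n), y)   # kernel ridge regression
        train_err = np.mean((K @ alpha - y) ** 2)
        ks = empirical_kernel_stability(X, gamma, rng=0)
        score = train_err + penalty_weight * ks / n
        if score < best_score:
            best_gamma, best_score = gamma, score
    return best_gamma

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(80, 3))
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=80)
    print(select_gamma(X, y, gammas=[0.01, 0.1, 1.0, 10.0]))
```

A real implementation of the paper's approach would replace the ad hoc additive score above with the KS-based generalization error bounds or CV variance bounds derived by the authors; the sketch only shows where a kernel-matrix stability term slots into a kernel selection loop.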
Pages: 5647-5658
Number of pages: 12
Related Papers
50 records in total
  • [1] De Marchi, Stefano; Schaback, Robert. Stability of Kernel-Based Interpolation. Advances in Computational Mathematics, 2010, 32(2): 155-161.
  • [2] Müller, K. R.; Mika, S.; Rätsch, G.; Tsuda, K.; Schölkopf, B. An Introduction to Kernel-Based Learning Algorithms. IEEE Transactions on Neural Networks, 2001, 12(2): 181-201.
  • [3] Lindenbaum, Ofir; Yeredor, Arie; Averbuch, Amir. Bandwidth Selection for Kernel-Based Classification. 2016 IEEE International Conference on the Science of Electrical Engineering (ICSEE), 2016.
  • [4] Scampicchio, Anna; Pillonetto, Gianluigi. A New Model Selection Approach to Hybrid Kernel-Based Estimation. 2018 IEEE Conference on Decision and Control (CDC), 2018: 3068-3073.
  • [5] Lin, Shao-Bo; Zhou, Ding-Xuan. Distributed Kernel-Based Gradient Descent Algorithms. Constructive Approximation, 2018, 47(2): 249-276.
  • [6] He, Xin; Wang, Junhui; Lv, Shaogao. Efficient Kernel-Based Variable Selection with Sparsistency. Statistica Sinica, 2021, 31(4): 2123-2151.
  • [7] Tan, Fuxiao; Han, Dezhi. The Characteristics of Kernel and Kernel-Based Learning. 2019 3rd International Symposium on Autonomous Systems (ISAS 2019), 2019: 406-411.
  • [8] Coutino, Mario; Chepuri, Sundeep Prabhakar; Leus, Geert. Subset Selection for Kernel-Based Signal Reconstruction. 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2018: 4014-4018.