Learning rates for regularized classifiers using multivariate polynomial kernels

Cited by: 22
Authors
Tong, Hongzhi [1 ,2 ]
Chen, Di-Rong [3 ,4 ]
Peng, Lizhong [1 ]
Affiliations
[1] Peking Univ, Sch Math Sci, LMAM, Beijing 100871, Peoples R China
[2] Univ Int Business & Econ, Dept Math, Beijing 100029, Peoples R China
[3] Beijing Univ Aeronaut & Astronaut, Dept Math, Beijing 100083, Peoples R China
[4] Beijing Univ Aeronaut & Astronaut, LMIB, Beijing 100083, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Regularized classifiers; Bernstein-Durrmeyer polynomials; Reproducing kernel Hilbert spaces; Polynomial kernels; Learning rates;
DOI
10.1016/j.jco.2008.05.008
CLC Number
TP301 [Theory, Methods];
Discipline Classification Code
081202;
Abstract
Regularized classifiers (support vector machines being a leading example) are a class of kernel-based classification methods generated from Tikhonov regularization schemes, and polynomial kernels are among the original and most important kernels used in them. In this paper, we provide an error analysis for regularized classifiers using multivariate polynomial kernels. We introduce Bernstein-Durrmeyer polynomials, whose reproducing kernel Hilbert space norms and approximation properties in the L^1 space play a key role in the analysis of the regularization error. We also present the standard estimation of the sample error, and derive explicit learning rates for these algorithms. (C) 2008 Elsevier Inc. All rights reserved.
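For context, the Tikhonov regularization scheme the abstract refers to can be sketched as follows; the hinge loss and the specific kernel normalization below are illustrative assumptions, not details taken from the paper. Given a sample z = {(x_i, y_i)}_{i=1}^m with labels y_i in {-1, +1} and a multivariate polynomial kernel K_d(x, u) = (1 + x·u)^d with reproducing kernel Hilbert space H_{K_d}, the regularized classifier is sgn(f_z^λ), where
\[
  f_z^{\lambda} \;=\; \arg\min_{f \in \mathcal{H}_{K_d}}
  \left\{ \frac{1}{m} \sum_{i=1}^{m} \phi\bigl(y_i f(x_i)\bigr)
          \;+\; \lambda \,\|f\|_{K_d}^{2} \right\},
  \qquad \phi(t) = \max\{0,\, 1 - t\} \ \ \text{(hinge loss, assumed here)}.
\]
Learning rates of the kind studied in the paper bound the excess misclassification error of sgn(f_z^λ) in terms of the sample size m and the regularization parameter λ.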
Pages: 619-631
Number of pages: 13