Learning rates of least-square regularized regression with polynomial kernels

Cited by: 4
Authors
Li BingZheng [1 ]
Wang GuoMao [1 ]
Affiliations
[1] Zhejiang Univ, Dept Math, Hangzhou 310027, Zhejiang, Peoples R China
Source
SCIENCE IN CHINA SERIES A-MATHEMATICS | 2009, Vol. 52, No. 4
Keywords
learning theory; reproducing kernel Hilbert space; polynomial kernel; regularization error; Bernstein-Durrmeyer operators; covering number; regularization scheme; APPROXIMATION;
DOI
10.1007/s11425-008-0137-5
Chinese Library Classification
O29 [Applied Mathematics];
Subject Classification Code
070104;
Abstract
This paper presents learning rates for least-square regularized regression algorithms with polynomial kernels. The goal is an error analysis for the regression problem in learning theory. A regularization scheme is given that yields sharp learning rates; the rates depend on the dimension of the polynomial space and on the capacity of the polynomial reproducing kernel Hilbert space measured by covering numbers. In addition, a direct approximation theorem for Bernstein-Durrmeyer operators is established in L^2_{ρ_X} with a Borel probability measure ρ_X.
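The regularization scheme referred to in the abstract is of the regularized least-squares (Tikhonov) type over a polynomial reproducing kernel Hilbert space. The following is a minimal sketch of that kind of scheme, assuming the standard empirical functional (1/m) Σ_i (f(x_i) − y_i)² + λ‖f‖²_K with the polynomial kernel K(x, z) = (1 + ⟨x, z⟩)^d; the function names, parameter values, and synthetic data are illustrative and not taken from the paper.

```python
import numpy as np

def poly_kernel(X, Z, degree):
    """Polynomial kernel K(x, z) = (1 + <x, z>)^degree."""
    return (1.0 + X @ Z.T) ** degree

def regularized_least_squares(X, y, lam, degree):
    """Regularized least-squares regression in the polynomial RKHS.

    Minimizes (1/m) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.
    By the representer theorem the minimizer is f(x) = sum_i alpha_i K(x_i, x),
    where alpha solves (K + m * lam * I) alpha = y.
    """
    m = X.shape[0]
    K = poly_kernel(X, X, degree)
    alpha = np.linalg.solve(K + m * lam * np.eye(m), y)
    return lambda X_new: poly_kernel(X_new, X, degree) @ alpha

# Illustrative usage on synthetic data (not from the paper).
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(200)
f_hat = regularized_least_squares(X, y, lam=1e-3, degree=5)
X_test = np.linspace(-1.0, 1.0, 11).reshape(-1, 1)
print(f_hat(X_test))
```

In this sketch the regularization parameter lam and the polynomial degree play the roles of the quantities whose trade-off governs the learning rates analyzed in the paper; the specific choices above are arbitrary.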
Pages: 687-700
Page count: 14