ON EXACT L1 RATES OF CONVERGENCE IN NONPARAMETRIC KERNEL REGRESSION

Cited by: 0
Author
WAND, MP
Institution
Keywords
KERNEL ESTIMATOR; MEAN ABSOLUTE ERROR; NONPARAMETRIC REGRESSION; RATES OF CONVERGENCE; WINDOW WIDTH;
DOI
Not available
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
The non-parametric estimation of regression functions with a fixed design on the interval [0,1] is considered. Gasser & Müller (1979, 1984) introduced a class of kernel estimators for this problem and derived optimal rates of convergence of the estimator with respect to mean squared error and integrated mean squared error. Alternative measures of loss are those based on the L1 metric. These have simple intuitive interpretations such as the "area between the two curves" for global estimation and the "absolute distance between the two points" for local estimation. In this note we derive optimal rates of convergence for L1-based measures of loss: mean absolute error and integrated mean absolute error. We demonstrate that there is little difference between L1-optimality and L2-optimality for non-parametric kernel regression.
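For orientation, a minimal sketch of the quantities the abstract refers to, written in the standard Gasser-Müller form; the notation below (regression function m, kernel K, bandwidth h, cut points s_i) is assumed here, since the abstract itself states no formulas.

\[
\hat m_h(x) \;=\; \sum_{i=1}^{n} Y_i \int_{s_{i-1}}^{s_i} \frac{1}{h}\, K\!\left(\frac{x-u}{h}\right) du,
\qquad s_0 = 0, \quad s_i = \tfrac{1}{2}(x_i + x_{i+1}), \quad s_n = 1,
\]

for a fixed design 0 \le x_1 < \cdots < x_n \le 1 on [0,1]. The L1-based measures of loss discussed in the note would then take the form

\[
\mathrm{MAE}(x) \;=\; \mathrm{E}\,\lvert \hat m_h(x) - m(x) \rvert,
\qquad
\mathrm{IMAE} \;=\; \int_0^1 \mathrm{E}\,\lvert \hat m_h(x) - m(x) \rvert \, dx,
\]

with the L2 analogues (mean squared error and integrated mean squared error) obtained by squaring the deviation instead of taking its absolute value.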
Pages: 251-256
Number of pages: 6