An Upper Bound of the Bias of Nadaraya-Watson Kernel Regression under Lipschitz Assumptions

Cited by: 2
Authors
Tosatto, Samuele [1 ]
Akrour, Riad [1 ]
Peters, Jan [1 ,2 ]
Affiliations
[1] Tech Univ Darmstadt, Comp Sci Dept, D-64289 Darmstadt, Germany
[2] Max Planck Inst Intelligent Syst, Comp Sci Dept, D-70569 Stuttgart, Germany
Source
STATS | 2021, Vol. 4, No. 1
Keywords
nonparametric regression; Nadaraya-Watson kernel regression; bias; bandwidth selection
DOI
10.3390/stats4010001
Chinese Library Classification
O1 [Mathematics]
Discipline classification code
0701; 070101
Abstract
The Nadaraya-Watson kernel estimator is among the most popular nonparametric regression techniques thanks to its simplicity. Its asymptotic bias was studied by Rosenblatt in 1969 and has been reported in much of the related literature. However, given its asymptotic nature, this result gives no access to a hard bound. The increasing popularity of predictive tools for automated decision-making raises the need for hard (non-probabilistic) guarantees. To alleviate this issue, we propose an upper bound on the bias which holds for finite bandwidths, using Lipschitz assumptions and relaxing some of the prerequisites of Rosenblatt's analysis. Our bound has potential applications in fields such as surgical robotics or self-driving cars, where hard guarantees on the prediction error are needed.
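For context, below is a minimal sketch of the Nadaraya-Watson estimator with a Gaussian kernel; the function name, the synthetic data, and the fixed bandwidth value are illustrative assumptions and do not reflect the bound or bandwidth choices analyzed in the paper.

```python
import numpy as np

def nadaraya_watson(x_query, x_train, y_train, bandwidth=0.5):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    The prediction at x_query is a weighted average of y_train, where each
    weight is the kernel evaluated at the distance between x_query and the
    corresponding training input (the bandwidth here is an assumed value).
    """
    # Gaussian kernel weights K_h(x_query - x_i)
    weights = np.exp(-0.5 * ((x_query - x_train) / bandwidth) ** 2)
    # Normalized weighted average of the training targets
    return np.sum(weights * y_train) / np.sum(weights)

# Illustrative usage on synthetic data
rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 2.0 * np.pi, 200)
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(200)
print(nadaraya_watson(np.pi / 2, x_train, y_train))  # should be close to sin(pi/2) = 1
```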
Pages: 1-17
Number of pages: 17