On difference-based gradient estimation in nonparametric regression

Cited: 1
Authors
Zhang, Maoyu [1 ]
Dai, Wenlin [1 ]
Affiliations
[1] Renmin Univ China, Inst Stat & Big Data, Ctr Appl Stat, Beijing, Peoples R China
Funding
National Natural Science Foundation of China; Beijing Natural Science Foundation;
Keywords
bias correction; difference sequence; gradient estimation; optimal convergence rate; plug-in bandwidth; derivative estimation; confidence bands; choice;
DOI
10.1002/sam.11644
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We propose a framework to directly estimate the gradient in multivariate nonparametric regression models that bypasses fitting the regression function. Specifically, we construct the estimator as a linear combination of adjacent observations with coefficients drawn from a vector-valued difference sequence, which makes it more flexible than existing methods. Under equidistant designs, we derive closed-form solutions for the optimal sequences by minimizing the estimation variance while keeping the estimation bias well controlled. We establish the theoretical properties of the estimators and show that they achieve the optimal convergence rate. Furthermore, we propose a data-driven criterion for selecting the tuning parameter in practice. The effectiveness of our estimators is validated via simulation studies and a real data application.
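The paper derives its optimal vector-valued difference sequences in closed form; those are not reproduced here. As a rough one-dimensional illustration of the general construction only (a linear combination of neighboring responses whose coefficients sum to zero, so the function value cancels, and whose first moment equals one, so the combination targets the derivative), consider the minimal sketch below. The function name difference_gradient and the default symmetric weights are illustrative assumptions, not the authors' method.

```python
import numpy as np

def difference_gradient(y, x, weights=None):
    """Toy difference-based gradient estimate on an equidistant design.

    Illustrative only; the paper's optimal sequences differ. The weights
    d_j must satisfy sum_j d_j = 0 (cancels f(x_i)) and sum_j j*d_j = 1
    (so the combination approximates h * f'(x_i)).
    """
    h = x[1] - x[0]  # equidistant spacing
    if weights is None:
        # symmetric second-order sequence: (y_{i+1} - y_{i-1}) / (2h)
        weights = np.array([-0.5, 0.0, 0.5])
    k = len(weights) // 2
    grad = np.full(len(y), np.nan)  # boundary points left as NaN
    for i in range(k, len(y) - k):
        grad[i] = np.dot(weights, y[i - k : i + k + 1]) / h
    return grad

# Example: for f(x) = sin(2*pi*x), the estimate should track 2*pi*cos(2*pi*x).
x = np.linspace(0.0, 1.0, 201)
y = np.sin(2 * np.pi * x) + 0.05 * np.random.default_rng(0).normal(size=x.size)
grad = difference_gradient(y, x)
```

A longer difference sequence buys noise averaging: the paper's coefficients minimize the estimation variance subject to moment constraints of this kind, which is what distinguishes them from the naive symmetric weights used above.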
Pages: 14