Estimation of Kullback-Leibler divergence by local likelihood

Cited: 24
Authors
Lee, Young Kyung [1]
Park, Byeong U. [1]
Affiliation
[1] Seoul Natl Univ, Dept Stat, Seoul 151747, South Korea
Keywords
kernel smoothing; local likelihood density estimation; bandwidth; Kullback-Leibler divergence
DOI
10.1007/s10463-005-0014-8
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
Motivated by the bandwidth selection problem in local likelihood density estimation and by the problem of assessing a final model chosen by a model selection procedure, we consider estimation of the Kullback-Leibler divergence. It is known that the best bandwidth choice for the local likelihood density estimator depends on the distance between the true density and the 'vehicle' parametric model. The Kullback-Leibler divergence may also serve as a measure for judging how far the true density is from a parametric family. We propose two estimators of the Kullback-Leibler divergence, derive their asymptotic distributions, and compare their finite-sample properties.
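For orientation, the display below gives the standard definition of the Kullback-Leibler divergence between the true density f and a candidate density f_theta; treating the 'vehicle' parametric family as {f_theta} and the target as its infimum over theta is an assumption about the setting, not a reproduction of the authors' estimators.

% Standard Kullback-Leibler divergence between the true density f and a
% parametric candidate f_theta; the distance of f from the parametric
% family is then the infimum over theta (assumed target, for orientation only).
\[
  D(f \,\|\, f_\theta) \;=\; \int f(x)\,\log\frac{f(x)}{f_\theta(x)}\,dx,
  \qquad
  \inf_{\theta} D(f \,\|\, f_\theta).
\]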
Pages: 327-340
Number of pages: 14