Asymptotic normality of an adaptive kernel density estimator for finite mixture models

Cited by: 3
Authors
Karunamuni, RJ
Sriram, TN [1 ]
Wu, J
Affiliations
[1] Univ Georgia, Dept Stat, Athens, GA 30602 USA
[2] Univ Alberta, Dept Math & Stat Sci, Edmonton, AB T6G 2G1, Canada
Keywords
adaptive kernel density estimator; minimum Hellinger distance estimation; asymptotic normality;
DOI
10.1016/j.spl.2005.07.017
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Classification Codes
020208 ; 070103 ; 0714 ;
Abstract
The choice of an appropriate kernel density estimator is a difficult one in minimum distance estimation based on density functions. In particular, for mixture models the choice of bandwidth is crucial, because the component densities may have different scale parameters, which in turn necessitate varying amounts of smoothing. Adaptive kernel density estimators use different bandwidths for different components, which makes them an ideal choice for minimum distance estimation in mixture models. Cutler and Cordero-Brana (1996. Minimum Hellinger distance estimates for parametric models. J. Amer. Stat. Assoc. 91, 1716-1721) introduced such an adaptive kernel density estimator in their work on minimum Hellinger distance estimation of mixture parameters. In this paper, we study a general version of their adaptive kernel density estimator and establish the asymptotic normality of the proposed estimator. We also illustrate the performance of our estimator via a small simulation study. (C) 2005 Elsevier B.V. All rights reserved.
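As a rough illustration of the idea described in the abstract (not the estimator actually analyzed in the paper), the sketch below builds a kernel density estimate for a two-component normal mixture in which the bandwidth applied to each observation is scaled by the standard deviation of the component that observation most plausibly belongs to, and then computes an approximate squared Hellinger distance to the true mixture density. All names and tuning choices (adaptive_kde, gaussian_kernel, the bandwidth constant, the hard component assignment) are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated two-component normal mixture whose components have very
# different scales -- the situation motivating an adaptive bandwidth.
n = 500
z = rng.random(n) < 0.6
x = np.where(z, rng.normal(0.0, 0.5, n), rng.normal(4.0, 2.0, n))

def gaussian_kernel(u):
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

def adaptive_kde(grid, data, weights, means, sds, h):
    """Kernel density estimate in which observation X_i gets bandwidth
    h * sd_j, where j is the component with the largest posterior
    probability for X_i (a hard-assignment simplification)."""
    dens = np.array([w * stats.norm.pdf(data, m, s)
                     for w, m, s in zip(weights, means, sds)])   # shape (k, n)
    comp = dens.argmax(axis=0)                                   # component label per point
    h_i = h * np.asarray(sds)[comp]                              # per-observation bandwidth
    u = (grid[:, None] - data[None, :]) / h_i[None, :]
    return (gaussian_kernel(u) / h_i[None, :]).mean(axis=1)

# Plug-in component parameters; in practice these would come from an EM fit
# or from the minimum distance procedure itself.
weights, means, sds = [0.6, 0.4], [0.0, 4.0], [0.5, 2.0]
grid = np.linspace(-3.0, 12.0, 400)
f_hat = adaptive_kde(grid, x, weights, means, sds, h=n ** (-1 / 5))

# Approximate squared Hellinger distance to the true mixture density:
# H^2(f_hat, f) = 1 - integral of sqrt(f_hat * f) dx, on a fixed grid.
f_true = sum(w * stats.norm.pdf(grid, m, s) for w, m, s in zip(weights, means, sds))
dx = grid[1] - grid[0]
h2 = 1.0 - np.sum(np.sqrt(f_hat * f_true)) * dx
print(f"approximate squared Hellinger distance: {h2:.4f}")
```

A fixed-bandwidth estimate on the same data would either oversmooth the narrow component or undersmooth the wide one; scaling the bandwidth per observation is what the adaptive construction is meant to avoid.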
Pages: 211-220
Page count: 10