Parameter estimation by minimizing a probability generating function-based power divergence

Cited by: 2
Authors
Tay, S. Y. [1 ]
Ng, C. M. [1 ]
Ong, S. H. [1 ,2 ]
Affiliations
[1] Univ Malaya, Inst Math Sci, Kuala Lumpur, Malaysia
[2] UCSI Univ, Dept Actuarial Sci & Appl Stat, Kuala Lumpur, Malaysia
Keywords
Density power divergence; Hellinger distance; Jeffreys' divergence; M-estimation; Probability generating function; DISCRETE-DISTRIBUTIONS; DISTANCE; ROBUST;
D O I
10.1080/03610918.2018.1468462
Chinese Library Classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics];
Subject classification codes
020208 ; 070103 ; 0714 ;
Abstract
Generating function-based statistical inference is an attractive approach when the probability (density) function is more complicated than the generating function. Here, we propose a parameter estimation method that minimizes a probability generating function (pgf)-based power divergence with a tuning parameter to mitigate the impact of data contamination. The proposed estimator is linked to the M-estimators and hence possesses the properties of consistency and asymptotic normality. In terms of parameter biases and mean squared errors from simulations, the proposed estimation method performs better for smaller values of the tuning parameter as the data contamination percentage increases.
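The estimation idea in the abstract can be sketched numerically. The code below fits a Poisson rate by minimizing a distance between the empirical pgf, \(\hat{G}(t) = n^{-1}\sum_i t^{x_i}\), and the model pgf \(G(t;\lambda) = e^{\lambda(t-1)}\) over \(t \in [0,1]\). The objective used here, \(\int_0^1 (\hat{G}(t)^a - G(t;\lambda)^a)^2\,dt\) with tuning parameter \(a\), is an illustrative simplified power-type distance, not necessarily the authors' exact divergence; the data-generating setup and tuning values are likewise assumptions for demonstration.

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.integrate import trapezoid

rng = np.random.default_rng(0)

# Hypothetical setup: Poisson(3) sample with 5% contamination at a large outlier value.
n = 500
x = rng.poisson(3.0, size=n).astype(float)
x[: n // 20] = 20.0  # contaminated observations

t = np.linspace(0.0, 1.0, 201)  # integration grid on [0, 1]

def empirical_pgf(x, t):
    """Empirical pgf: G_hat(t) = mean over i of t**x_i."""
    return np.mean(np.power.outer(t, x), axis=1)

def poisson_pgf(t, lam):
    """Poisson pgf: G(t; lambda) = exp(lambda * (t - 1))."""
    return np.exp(lam * (t - 1.0))

G_hat = empirical_pgf(x, t)

def objective(lam, a):
    """Illustrative pgf-based power distance with tuning parameter a
    (a simplified stand-in for the paper's divergence)."""
    diff = G_hat**a - poisson_pgf(t, lam) ** a
    return trapezoid(diff**2, t)

estimates = {}
for a in (0.25, 0.5, 1.0):
    res = minimize_scalar(objective, bounds=(0.1, 10.0), args=(a,), method="bounded")
    estimates[a] = res.x
    print(f"a = {a}: lambda_hat = {res.x:.3f}")
```

Because the pgf downweights large observations (the outliers at 20 contribute almost nothing to \(t^{x_i}\) for \(t < 1\)), the estimates stay much closer to the true rate than the contaminated sample mean would, consistent with the robustness behavior reported in the abstract.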
Pages: 2898-2912 (15 pages)