MINIMUM DISCRIMINATION INFORMATION ESTIMATOR OF THE MEAN WITH KNOWN COEFFICIENT OF VARIATION

Cited by: 5
Authors
SOOFI, ES
GOKHALE, DV
Affiliations
[1] UNIV WISCONSIN,SCH BUSINESS ADM,MILWAUKEE,WI 53201
[2] UNIV CALIF RIVERSIDE,DEPT STAT,RIVERSIDE,CA 92502
Keywords
ENTROPY; LOCATION-SCALE FAMILY; SEMIPARAMETRIC; NORMAL; LOGISTIC; DOUBLE EXPONENTIAL;
DOI
10.1016/0167-9473(91)90067-C
Chinese Library Classification (CLC)
TP39 [Applications of Computers];
Subject Classification Codes
081203 ; 0835 ;
Abstract
Estimation of a mean when the coefficient of variation is known is treated as a constrained optimization of the Kullback-Leibler discrimination information function. The procedure is semiparametric in that it uses a normal distribution as the maximum entropy model to develop an estimator for the mean of all distributions in the location-scale family with finite entropy. The performance of the proposed procedure is compared with some existing methods in a bootstrap study of data from a laboratory experiment. Monte Carlo simulations are also used to compare the mean squared error of several estimators under various distributional assumptions and over a wide range of values for the coefficient of variation and sample size. The new estimator shows smaller mean squared error than its nonparametric counterpart and competes well with its parametric counterparts.
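To give a concrete sense of the kind of simulation comparison described in the abstract, the following minimal Python sketch contrasts the plain sample mean with a closed-form estimator that exploits a known coefficient of variation. The estimator used here is the normal-model maximum-likelihood form for a mean with known coefficient of variation, assumed purely for illustration; it is not the MDI estimator derived in the paper, and names such as mle_known_cv and mse_comparison are hypothetical.

# Illustrative Monte Carlo in the spirit of the paper's simulation study.
# NOTE: the estimator below is the normal-model MLE for a mean with a known
# coefficient of variation (an assumption for illustration), not the MDI
# estimator developed by Soofi and Gokhale.
import numpy as np

def mle_known_cv(x, kappa):
    # Normal-model MLE of mu when sigma = kappa * mu with kappa known:
    # solve kappa^2 * mu^2 + xbar * mu - m2 = 0 and take the positive root,
    # where xbar is the sample mean and m2 the sample second raw moment.
    xbar = x.mean()
    m2 = np.mean(x ** 2)
    return (-xbar + np.sqrt(xbar ** 2 + 4 * kappa ** 2 * m2)) / (2 * kappa ** 2)

def mse_comparison(mu=10.0, kappa=0.3, n=20, reps=5000, seed=0):
    # Simulated mean squared error of the sample mean versus the known-CV
    # estimator under a normal data-generating model.
    rng = np.random.default_rng(seed)
    err_mean, err_cv = [], []
    for _ in range(reps):
        x = rng.normal(mu, kappa * mu, size=n)
        err_mean.append((x.mean() - mu) ** 2)
        err_cv.append((mle_known_cv(x, kappa) - mu) ** 2)
    return np.mean(err_mean), np.mean(err_cv)

if __name__ == "__main__":
    mse_xbar, mse_cv = mse_comparison()
    print(f"MSE(sample mean)        = {mse_xbar:.4f}")
    print(f"MSE(known-CV estimator) = {mse_cv:.4f}")

Under this normal setup the known-CV estimator typically shows a smaller simulated MSE than the sample mean, which mirrors the direction of the comparison reported in the abstract; other data-generating distributions (logistic, double exponential) can be substituted to imitate the paper's wider study.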
Pages: 165 - 177 (13 pages)