Keywords:
extended jackknife; weighted jackknife; bootstrap; linear and nonlinear regression; bias estimates; variance estimates; mean absolute error; median absolute error; extended sample
DOI:
10.1080/02331889308802430
CLC number:
O21 [Probability Theory and Mathematical Statistics];
C8 [Statistics];
Subject classification number:
020208; 070103; 0714;
Abstract:
Ordinary or weighted jackknife variance or bias estimates may be very inefficient. We show this in the k-sample model, where their risks are k times larger than those of the estimates from asymptotic theory. We propose "extended jackknife estimates" intended to overcome this possible inefficiency. Indeed, in the k-sample model they are identical to the "asymptotic" estimates, which are also best unbiased and bootstrap estimators; we show this even for general linear models. Under a nonlinear regression model we obtain a high-order asymptotic equivalence between the extended jackknife and asymptotic estimates. A considerable small-sample improvement over the ordinary or weighted jackknife may therefore be expected, at least for models with a structure close to that of the k-sample problem. The estimation of the mean and the median of the absolute error of a one-dimensional estimator is briefly discussed from both the small- and large-sample points of view.
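For orientation, the "ordinary" (delete-one) jackknife bias and variance estimates that the paper compares against can be sketched as follows. This is a minimal illustration of the standard formulas, not the paper's extended jackknife; the function and variable names (jackknife, estimator, x) are chosen here for illustration only.

import numpy as np

def jackknife(estimator, x):
    """Ordinary delete-one jackknife bias and variance estimates
    for a one-dimensional statistic theta_hat = estimator(x)."""
    x = np.asarray(x)
    n = len(x)
    theta_hat = estimator(x)
    # Leave-one-out replicates theta_hat_(i), i = 1, ..., n
    loo = np.array([estimator(np.delete(x, i)) for i in range(n)])
    loo_mean = loo.mean()
    bias_est = (n - 1) * (loo_mean - theta_hat)              # jackknife bias estimate
    var_est = (n - 1) / n * np.sum((loo - loo_mean) ** 2)    # jackknife variance estimate
    return bias_est, var_est

# Example: bias and variance estimates for the sample variance
rng = np.random.default_rng(0)
sample = rng.normal(size=30)
print(jackknife(np.var, sample))

In the k-sample setting the paper considers, such delete-one estimates are exactly the ones whose risk can be k times that of the asymptotic-theory estimates, which motivates the extended jackknife construction.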