Stochastic Convergences in Divergences

Cited by: 0
Author
Kus, Vaclav [1 ]
Affiliation
[1] Czech Tech Univ, Fac Nucl Sci & Phys Engn, Dept Math, Trojanova 13, Prague 12000, Czech Republic
Keywords
MINIMUM HELLINGER DISTANCE; CONSISTENT; VARIABLES; NORMALITY; FIT
DOI
Not available
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
The paper presents a new unifying look at statistical inference. We define score functions and minimum score estimators. We show that stochastic convergence of estimators, i.e. the consistency of estimators in score functions, leads to various types of consistency in the well-known statistical distances or disparity measures between probability distributions. We formulate conditions under which a score function is the phi-divergence of the theoretical and empirical distributions; conversely, each phi-divergence is a score function. We prove that minimization of an arbitrary divergence score function leads to the classical histogram density estimator, and that a special score function leads, in a similar sense, to the minimum Kolmogorov distance estimator.
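As an illustration of the last point, the sketch below (not taken from the paper) fits a location parameter by minimizing the Kolmogorov distance between the empirical CDF of a sample and the CDF of an assumed N(theta, 1) model. The function names and the choice of a normal location family are assumptions made purely for this example.

```python
# Illustrative sketch only, assuming a normal location family N(theta, 1);
# this is not the construction used in the paper.
import numpy as np
from scipy import optimize, stats


def kolmogorov_distance(theta, sample):
    # sup_x |F_n(x) - F_theta(x)|: scipy's one-sample KS statistic against
    # the N(theta, 1) CDF plays the role of the Kolmogorov distance.
    return stats.kstest(sample, stats.norm(loc=theta, scale=1.0).cdf).statistic


def min_kolmogorov_estimate(sample):
    # Minimize the Kolmogorov distance over theta on the sample range
    # (bounded scalar search; adequate for this one-parameter illustration).
    result = optimize.minimize_scalar(
        kolmogorov_distance,
        args=(sample,),
        bounds=(float(sample.min()), float(sample.max())),
        method="bounded",
    )
    return result.x


if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)
    data = rng.normal(loc=2.0, scale=1.0, size=500)
    print("minimum Kolmogorov distance estimate:", min_kolmogorov_estimate(data))
```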
Pages: 135-145
Page count: 11