Mixture Models, Bayes Fisher Information, and Divergence Measures

Cited by: 22
Authors
Asadi, Majid [1 ,2 ]
Ebrahimi, Nader [3 ]
Kharazmi, Omid [1 ]
Soofi, Ehsan S. [4 ]
Affiliations
[1] Univ Isfahan, Dept Stat, Esfahan 81744, Iran
[2] Inst Res Fundamental Sci IPM, Sch Math, Tehran 193955746, Iran
[3] Northern Illinois Univ, Div Stat, De Kalb, IL 60155 USA
[4] Univ Wisconsin, Lubar Sch Business, Milwaukee, WI 53201 USA
Keywords
Fisher information; Jeffreys divergence; Jensen-Shannon divergence; Kullback-Leibler divergence; Renyi divergence; Tsallis divergence
DOI
10.1109/TIT.2018.2877608
Chinese Library Classification
TP [automation technology, computer technology]
Discipline code
0812
Abstract
This paper presents the Bayes Fisher information measures, defined as the expected Fisher information under a distribution for the parameter, for the arithmetic, geometric, and generalized mixtures of two probability density functions. The Fisher information of the arithmetic mixture about the mixing parameter is related to the chi-square divergence, Shannon entropy, and the Jensen-Shannon divergence. The Bayes Fisher measures of the three mixture models are related to the Kullback-Leibler, Jeffreys, Jensen-Shannon, Renyi, and Tsallis divergences. These measures indicate that the farther apart the components are, the more informative the data are about the mixing parameter. We also unify three different relative entropy derivations of the geometric mixture scattered across the statistics and physics literatures. Extending two of these formulations to the minimization of Tsallis divergence yields the generalized mixture as the solution.
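The link between the mixture's Fisher information and chi-square divergence stated in the abstract can be checked numerically. For the arithmetic mixture f_p = p·f1 + (1−p)·f2, the Fisher information about p is I(p) = ∫ (f1 − f2)² / f_p dx, and since f1 − f_p = (1−p)(f1 − f2) pointwise, this equals χ²(f1‖f_p) / (1−p)². The sketch below verifies the identity on a grid for two hypothetical Gaussian components (the component choices and grid are illustrative assumptions, not from the paper):

```python
import numpy as np

def gaussian(x, mu, sigma):
    """Normal density evaluated on a grid."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Hypothetical mixture components (illustrative choices, not from the paper).
x = np.linspace(-12.0, 14.0, 200001)   # fine grid covering both components
f1 = gaussian(x, -1.0, 1.0)
f2 = gaussian(x, 2.0, 1.5)

p = 0.3
fp = p * f1 + (1 - p) * f2             # arithmetic mixture f_p

# Fisher information about the mixing parameter: I(p) = ∫ (f1 - f2)^2 / f_p dx
fisher = np.trapz((f1 - f2) ** 2 / fp, x)

# Chi-square divergence chi^2(f1 || f_p) = ∫ (f1 - f_p)^2 / f_p dx
chi2 = np.trapz((f1 - fp) ** 2 / fp, x)

# Identity: I(p) = chi^2(f1 || f_p) / (1 - p)^2
print(fisher, chi2 / (1 - p) ** 2)
```

Because the pointwise relation f1 − f_p = (1−p)(f1 − f2) is exact, the two quantities agree to floating-point precision regardless of the grid.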
Pages: 2316-2321 (6 pages)