Approximating posterior distributions for mixture-model parameters

Cited by: 0
Authors
Center, JL [1]
Affiliation
[1] Creat Res Corp, Andover, MA 01810 USA
Keywords
DOI
Not available
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714
Abstract
We consider a fully Bayesian approach to using mixture models in pattern classification problems. From a Bayesian point of view, classification should be done by integrating over the posterior distribution of the model parameters (including the number of components in the mixture) given previous observations. Although it is relatively easy to compute the joint probability density of the observed data and a particular choice of model parameters, it is usually difficult to integrate over the whole distribution because of the high dimensionality of the parameter space. Most current methods using mixture models settle for finding a mode of the posterior distribution with the Expectation-Maximization (EM) algorithm, but much more can be learned about the posterior distribution. We explore the multi-state MCMC methods introduced by Skilling and show how they can be applied to Gaussian mixture models. In addition, we examine genetic algorithms, which are most often used for optimization, and show how they can be adapted to act as multi-state MCMC algorithms, as suggested by MacKay.
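The contrast the abstract draws between EM mode-finding and sampling the full posterior can be illustrated with a minimal sketch. This is not the paper's multi-state method: it is an ordinary random-walk Metropolis sampler over the parameters of a two-component, unit-variance Gaussian mixture with flat priors, and all data values, step sizes, and starting parameters below are illustrative assumptions.

```python
import math
import random

random.seed(0)

# Illustrative data: a two-component Gaussian mixture with unit variances,
# means at -2 and 3, and mixing weights 0.6 / 0.4 (assumed, not from the paper).
data = ([random.gauss(-2.0, 1.0) for _ in range(60)] +
        [random.gauss(3.0, 1.0) for _ in range(40)])

def log_likelihood(params, xs):
    """Log joint density of the data given (w, mu1, mu2).

    w is the mixing weight of component 1; both components have unit
    variance for simplicity.
    """
    w, mu1, mu2 = params
    total = 0.0
    for x in xs:
        p = (w * math.exp(-0.5 * (x - mu1) ** 2) +
             (1.0 - w) * math.exp(-0.5 * (x - mu2) ** 2)) / math.sqrt(2 * math.pi)
        total += math.log(p)
    return total

def metropolis(xs, n_steps=2000):
    """Random-walk Metropolis over (w, mu1, mu2) with flat priors.

    With flat priors the posterior is proportional to the likelihood,
    so the accept/reject step only compares log-likelihoods.
    """
    current = (0.5, -1.0, 1.0)          # arbitrary starting point
    current_ll = log_likelihood(current, xs)
    samples = []
    for _ in range(n_steps):
        # Propose a local perturbation; keep the weight inside (0, 1).
        w = min(max(current[0] + random.gauss(0, 0.05), 0.01), 0.99)
        proposal = (w,
                    current[1] + random.gauss(0, 0.2),
                    current[2] + random.gauss(0, 0.2))
        prop_ll = log_likelihood(proposal, xs)
        if math.log(random.random()) < prop_ll - current_ll:
            current, current_ll = proposal, prop_ll
        samples.append(current)
    return samples

samples = metropolis(data)
# Posterior means after a burn-in of 500 steps; unlike an EM point
# estimate, the retained samples also carry the posterior spread.
kept = samples[500:]
post_mu1 = sum(s[1] for s in kept) / len(kept)
post_mu2 = sum(s[2] for s in kept) / len(kept)
print(post_mu1, post_mu2)
```

Where EM would return only the single parameter setting at a mode, the retained samples approximate the whole posterior, so quantities such as credible intervals for the component means come directly from the sample spread.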
Pages: 437-444
Page count: 8