Global Convergence of the EM Algorithm for Unconstrained Latent Variable Models with Categorical Indicators

Cited by: 0
Author
Alexander Weissman
Institution
[1] Law School Admission Council, Psychometric Research
Source
Psychometrika | 2013 / Vol. 78
Keywords
EM algorithm; latent variable models; latent class models; information theory; Kullback–Leibler divergence; relative entropy; variational calculus; convex optimization; optimal bounds
DOI
Not available
Abstract
A proof that the expectation-maximization (EM) algorithm converges to a global optimum of the marginal log likelihood function is presented for unconstrained latent variable models with categorical indicators. Sufficient conditions for global convergence of the EM algorithm are derived in an information-theoretic context by interpreting the EM algorithm as alternating minimization of the Kullback–Leibler divergence between two convex sets. These conditions are shown to be satisfied by an unconstrained latent class model, yielding an optimal bound against which more highly constrained models may be compared.
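The alternating-minimization view described in the abstract can be illustrated with a minimal sketch. This is not the paper's derivation; it is a standard EM fit of an unconstrained latent class model with binary indicators, where the E-step (computing class responsibilities) and the M-step (updating class proportions and item probabilities) correspond to the two halves of the alternating KL minimization. All data and parameter values below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: N respondents answering J binary items (categorical indicators),
# generated from K = 2 latent classes. All values here are illustrative.
N, J, K = 200, 4, 2
true_pi = np.array([0.6, 0.4])                 # class proportions
true_p = np.array([[0.9, 0.8, 0.85, 0.7],      # P(item = 1 | class)
                   [0.2, 0.3, 0.25, 0.1]])
z = rng.choice(K, size=N, p=true_pi)
X = (rng.random((N, J)) < true_p[z]).astype(float)

# EM for an unconstrained latent class model.
pi = np.full(K, 1.0 / K)
p = rng.uniform(0.3, 0.7, size=(K, J))
ll_trace = []
for _ in range(500):
    # E-step: responsibilities q(z | x). In the alternating-minimization
    # view, this choice of q minimizes the KL divergence for fixed parameters.
    log_joint = X @ np.log(p).T + (1 - X) @ np.log(1 - p).T + np.log(pi)
    log_marg = np.logaddexp.reduce(log_joint, axis=1, keepdims=True)
    resp = np.exp(log_joint - log_marg)
    ll_trace.append(log_marg.sum())            # marginal log likelihood
    # M-step: maximize the expected complete-data log likelihood,
    # i.e., minimize the KL divergence over the model's parameters.
    Nk = resp.sum(axis=0)
    pi = Nk / N
    p = np.clip((resp.T @ X) / Nk[:, None], 1e-9, 1 - 1e-9)
    if len(ll_trace) > 1 and ll_trace[-1] - ll_trace[-2] < 1e-10:
        break
```

Each iteration can only increase the marginal log likelihood, which is the monotonicity property that the paper strengthens, under its sufficient conditions, to convergence to a global optimum.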
Pages: 134-153
Page count: 19