Alternative EM methods for nonparametric finite mixture models

Cited by: 36
Authors
Pilla, RS
Lindsay, BG
Affiliations
[1] Univ Illinois, Div Epidemiol & Biostat, Chicago, IL 60612 USA
[2] Penn State Univ, Dept Stat, University Pk, PA 16802 USA
Funding
US National Science Foundation; US National Institutes of Health
Keywords
augmentation; complete data; EM algorithm; finite mixture distribution; high-dimensional; maximum likelihood; missing data; rate of convergence; nonparametric mixture; zero-elimination;
DOI
10.1093/biomet/88.2.535
Chinese Library Classification: Q [Biological Sciences]
Subject classification codes: 07; 0710; 09
Abstract
This research focuses on a general class of maximum likelihood problems in which it is desired to maximise a nonparametric mixture likelihood with finitely many known component densities over the set of unknown weight parameters. Convergence of the conventional EM algorithm for this problem is extremely slow when the component densities are poorly separated and when the maximum likelihood estimator requires some of the weights to be zero, as the algorithm can never reach such a boundary point. Alternative methods based on the principles of EM are developed using a two-stage approach. First, a new data augmentation scheme provides improved convergence rates in certain parameter directions. Secondly, two 'cyclic versions' of this data augmentation are created by changing the missing data formulation between the EM-steps; these extend the acceleration directions to the whole parameter space, giving another order of magnitude increase in convergence rate. Examples indicate that the new cyclic versions of the data augmentation schemes can converge up to 500 times faster than the conventional EM algorithm for fitting nonparametric finite mixture models.
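For context, the conventional EM iteration that the proposed augmentation schemes accelerate reduces, in this setting, to a fixed-point update on the weight vector alone, since the component densities are known. The sketch below illustrates that baseline update in Python; the function name, the precomputed density matrix F (with F[i, j] = f_j(x_i)), and the stopping rule are illustrative assumptions, and the paper's cyclic data augmentation schemes are not reproduced here.

    import numpy as np

    def em_mixture_weights(F, n_iter=10000, tol=1e-10):
        # F is an (n, k) array with F[i, j] = f_j(x_i): the j-th known
        # component density evaluated at the i-th observation.
        # Only the weight vector pi is estimated.
        n, k = F.shape
        pi = np.full(k, 1.0 / k)              # start from uniform weights
        for _ in range(n_iter):
            mix = F @ pi                      # mixture density at each x_i
            resp = F * pi / mix[:, None]      # E-step: posterior component probabilities
            pi_new = resp.mean(axis=0)        # M-step: new weight = average responsibility
            if np.max(np.abs(pi_new - pi)) < tol:
                return pi_new
            pi = pi_new
        return pi

Note that each update multiplies pi[j] by a strictly positive factor (the average of f_j(x_i) / mix_i over the sample), so a weight that starts positive can reach zero only in the limit; together with poorly separated components, this is the boundary behaviour that makes the conventional algorithm slow in the cases described above.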
Pages: 535-550
Number of pages: 16