Variational learning for Gaussian mixture models

Cited by: 110
Authors
Nasios, Nikolaos [1 ]
Bors, Adrian G. [1 ]
Affiliation
[1] Univ York, Dept Comp Sci, York YO10 5DD, N Yorkshire, England
Keywords
Bayesian inference; expectation-maximization algorithm; Gaussian mixtures; maximum log-likelihood estimation; variational training
DOI
10.1109/TSMCB.2006.872273
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
This paper proposes a joint maximum-likelihood and Bayesian methodology for estimating Gaussian mixture models. In Bayesian inference, the model parameters are themselves described by distributions, each characterized by hyperparameters. For Gaussian mixtures, the parameter distributions are taken as Gaussian for the means, Wishart for the covariances, and Dirichlet for the mixing probabilities. The learning task then consists of estimating the hyperparameters that characterize these distributions. The integration over the parameter space is decoupled using an unsupervised variational methodology termed variational expectation-maximization (VEM). The paper also introduces a hyperparameter initialization procedure for the training algorithm: in a first stage, empirical distributions of the parameters are formed from successive runs of the expectation-maximization (EM) algorithm; maximum-likelihood estimators are then applied to these distributions to find appropriate initial values for the hyperparameters. The proposed initialization provides faster convergence, more accurate hyperparameter estimates, and better generalization for the VEM training algorithm. The methodology is applied to blind signal detection and to color image segmentation.
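As an illustration only, the sketch below mimics the two-stage initialization described in the abstract using scikit-learn as a stand-in: GaussianMixture plays the role of classical EM, and BayesianGaussianMixture that of the variational (VEM-style) model. This is not the authors' VEM implementation; the toy data, the component count, and the choice to seed only the Gaussian hyperprior on the means (mean_prior) are assumptions made for the example.

```python
# Minimal sketch of a two-stage hyperparameter initialization, assuming
# scikit-learn: GaussianMixture as the classical EM stage, and
# BayesianGaussianMixture as the variational Bayesian GMM. NOT the
# authors' VEM algorithm; only the mean hyperprior is seeded here.
import numpy as np
from sklearn.mixture import GaussianMixture, BayesianGaussianMixture

rng = np.random.default_rng(0)
# Toy data: two well-separated 2-D Gaussian clusters.
X = np.vstack([rng.normal(0.0, 1.0, size=(200, 2)),
               rng.normal(5.0, 1.0, size=(200, 2))])

# Stage 1: successive EM runs with different seeds; collect the fitted
# means to form an empirical distribution of the mean parameters.
em_means = np.vstack([GaussianMixture(n_components=2, random_state=s)
                      .fit(X).means_ for s in range(10)])

# Stage 2: maximum-likelihood estimate over the collected EM means, used
# as the location of the Gaussian prior on the component means.
mean_prior = em_means.mean(axis=0)

vem = BayesianGaussianMixture(n_components=2, mean_prior=mean_prior,
                              random_state=0).fit(X)
print(vem.means_)
print(vem.weights_)
```

The same pattern would extend to the other hyperpriors exposed by BayesianGaussianMixture (weight_concentration_prior, covariance_prior, degrees_of_freedom_prior) if the mixing weights and covariances from the EM runs were collected as well.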
Pages: 849-862
Page count: 14
Related Papers (50 in total)
  • [1] Lower bounds of stochastic complexities in variational Bayes learning of Gaussian mixture models
    Watanabe, K
    Watanabe, S
    [J]. 2004 IEEE CONFERENCE ON CYBERNETICS AND INTELLIGENT SYSTEMS, VOLS 1 AND 2, 2004, : 99 - 104
  • [2] Brain MRI image segmentation based on learning local variational Gaussian mixture models
    Xia, Yong
    Ji, Zexuan
    Zhang, Yanning
    [J]. NEUROCOMPUTING, 2016, 204 : 189 - 197
  • [3] GPGPU Implementation of Variational Bayesian Gaussian Mixture Models
    Nishimoto, Hiroki
    Zhang, Renyuan
    Nakashima, Yasuhiko
    [J]. IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2022, E105D (03) : 611 - 622
  • [4] Variational Inference of Finite Asymmetric Gaussian Mixture Models
    Song, Ziyang
    Bregu, Ornela
    Ali, Samr
    Bouguila, Nizar
    [J]. 2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019), 2019, : 2448 - 2454
  • [5] Variational Gaussian Mixture Models for Speech Emotion Recognition
    Mishra, Harendra Kumar
    Sekhar, C. Chandra
    [J]. ICAPR 2009: SEVENTH INTERNATIONAL CONFERENCE ON ADVANCES IN PATTERN RECOGNITION, PROCEEDINGS, 2009, : 183 - 186
  • [6] Variational Inference of Finite Generalized Gaussian Mixture Models
    Amudala, Srikanth
    Ali, Samr
    Najar, Fatma
    Bouguila, Nizar
    [J]. 2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019), 2019, : 2433 - 2439
  • [7] GPGPU Implementation of Variational Bayesian Gaussian Mixture Models
    Nishimoto, Hiroki
    Nakada, Takashi
    Nakashima, Yasuhiko
    [J]. 2019 SEVENTH INTERNATIONAL SYMPOSIUM ON COMPUTING AND NETWORKING (CANDAR 2019), 2019, : 185 - 190
  • [8] Variational Bayesian feature selection for Gaussian mixture models
    Valente, F
    Wellekens, C
    [J]. 2004 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOL I, PROCEEDINGS: SPEECH PROCESSING, 2004, : 513 - 516
  • [9] Trust-Region Variational Inference with Gaussian Mixture Models
    Arenz, Oleg
    Zhong, Mingjun
    Neumann, Gerhard
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2020, 21