Variational inference and sparsity in high-dimensional deep Gaussian mixture models

Cited: 0
Authors
Lucas Kock
Nadja Klein
David J. Nott
Affiliations
[1] Humboldt-Universität zu Berlin, Chair of Statistics and Data Science; Emmy Noether Research Group (DFG, German Research Foundation); Member of the Junge Akademie at the Berlin-Brandenburg Academy of Sciences and Humanities and the National Academy of Sciences Leopoldina; Member of the Humboldt Network, Alexander von Humboldt Foundation
[2] National University of Singapore
Source
Statistics and Computing | 2022年 / 32卷
Keywords
Deep clustering; High-dimensional clustering; Horseshoe prior; Mixtures of factor analyzers; Natural gradient; Variational approximation;
DOI
Not available
Abstract
Gaussian mixture models are a popular tool for model-based clustering, and mixtures of factor analyzers are Gaussian mixture models having a parsimonious factor covariance structure for the mixture components. There are several recent extensions of mixtures of factor analyzers to deep mixtures, where the Gaussian model for the latent factors is replaced by a mixture of factor analyzers. This construction can be iterated to obtain a model with many layers. These deep models are challenging to fit, and we consider Bayesian inference using sparsity priors to further regularize the estimation. A scalable natural gradient variational inference algorithm is developed for fitting the model, and we suggest computationally efficient approaches to the architecture choice using overfitted mixtures, where unnecessary components drop out during estimation. In several simulated examples and two real examples, we demonstrate the versatility of our approach for high-dimensional problems, and show that sparsity-inducing priors can help obtain improved clustering results.
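The overfitted-mixture idea in the abstract (start with more components than needed and let variational inference shrink unused component weights toward zero) can be illustrated with a minimal sketch. This is not the paper's method; it uses scikit-learn's `BayesianGaussianMixture` as a stand-in for the general idea, and the toy data and thresholds are assumptions for illustration only.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Hypothetical toy data: three well-separated Gaussian clusters in 5 dimensions.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=0.3, size=(100, 5)) for c in (-3.0, 0.0, 3.0)])

# Overfitted mixture: deliberately specify more components (10) than true clusters (3).
# Variational inference with a small Dirichlet concentration prior pushes the
# weights of unneeded components toward zero, so the effective number of
# clusters is learned rather than fixed in advance.
model = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior=0.01,  # small value encourages sparsity in the weights
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(X)

# Components retaining non-negligible weight (threshold is an arbitrary choice here).
effective = int(np.sum(model.weights_ > 0.01))
print(effective)  # typically close to the true number of clusters
```

The same pruning behavior motivates the paper's architecture-selection strategy: rather than comparing many fitted architectures, one overfitted model is estimated and redundant components are discarded.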
Related papers
50 records
  • [1] Variational inference and sparsity in high-dimensional deep Gaussian mixture models
    Kock, Lucas
    Klein, Nadja
    Nott, David J.
    [J]. STATISTICS AND COMPUTING, 2022, 32 (05)
  • [2] Correction to: Variational inference and sparsity in high-dimensional deep Gaussian mixture models
    Kock, Lucas
    Klein, Nadja
    Nott, David J.
    [J]. STATISTICS AND COMPUTING, 2023, 33
  • [3] Variational inference and sparsity in high-dimensional deep Gaussian mixture models (vol 32, 70, 2022)
    Kock, Lucas
    Klein, Nadja
    Nott, David J.
    [J]. STATISTICS AND COMPUTING, 2023, 33 (01)
  • [4] Uniform inference in high-dimensional Gaussian graphical models
    Klaassen, S.
    Kueck, J.
    Spindler, M.
    Chernozhukov, V.
    [J]. BIOMETRIKA, 2023, 110 (01) : 51 - 68
  • [5] Variational Bayesian Inference in High-Dimensional Linear Mixed Models
    Yi, Jieyi
    Tang, Niansheng
    [J]. MATHEMATICS, 2022, 10 (03)
  • [6] Gaussian Variational Approximations for High-dimensional State Space Models
    Quiroz, Matias
    Nott, David J.
    Kohn, Robert
    [J]. BAYESIAN ANALYSIS, 2023, 18 (03): : 989 - 1016
  • [7] Consistent Bayesian sparsity selection for high-dimensional Gaussian DAG models with multiplicative and beta-mixture priors
    Cao, Xuan
    Khare, Kshitij
    Ghosh, Malay
    [J]. JOURNAL OF MULTIVARIATE ANALYSIS, 2020, 179
  • [8] Regularized Parameter Estimation in High-Dimensional Gaussian Mixture Models
    Ruan, Lingyan
    Yuan, Ming
    Zou, Hui
    [J]. NEURAL COMPUTATION, 2011, 23 (06) : 1605 - 1622
  • [9] Variational Inference of Finite Asymmetric Gaussian Mixture Models
    Song, Ziyang
    Bregu, Ornela
    Ali, Samr
    Bouguila, Nizar
    [J]. 2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019), 2019, : 2448 - 2454
  • [10] Variational Inference of Finite Generalized Gaussian Mixture Models
    Amudala, Srikanth
    Ali, Samr
    Najar, Fatma
    Bouguila, Nizar
    [J]. 2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019), 2019, : 2433 - 2439