k-means as a variational EM approximation of Gaussian mixture models

Cited: 34
Authors
Luecke, Joerg [1 ]
Forster, Dennis [1 ]
Affiliations
[1] Carl von Ossietzky Univ Oldenburg, Machine Learning Lab, Ammerlander Heerstr 114-118, D-26129 Oldenburg, Germany
Keywords
k-means; Gaussian mixture models; Expectation maximization; Variational methods; Free energy; Algorithm
DOI
10.1016/j.patrec.2019.04.001
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
We show that k-means (Lloyd's algorithm) is obtained as a special case when truncated variational EM approximations are applied to Gaussian mixture models (GMMs) with isotropic Gaussians. In contrast to the standard way of relating k-means and GMMs, the provided derivation shows that it is not required to consider Gaussians with small variances or the limit case of zero variances. A number of consequences follow directly from our approach: (A) k-means can be shown to increase a free energy (a.k.a. ELBO) associated with truncated distributions, and this free energy can be reformulated directly in terms of the k-means objective; (B) k-means generalizations can be derived directly by considering the 2nd-closest, 3rd-closest, etc., clusters in addition to just the closest one; and (C) the embedding of k-means into a free energy framework allows for theoretical interpretations of other k-means generalizations in the literature. In general, truncated variational EM provides a natural and rigorous quantitative link between k-means-like clustering and GMM clustering algorithms, which may be very relevant for future theoretical and empirical studies. (C) 2019 Elsevier B.V. All rights reserved.
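The abstract's central construction can be sketched in code: an E-step that truncates the posterior to the C closest clusters of each data point, and an M-step that updates means as responsibility-weighted averages. With C = 1 the truncated posterior is a hard one-hot assignment and the update reduces to Lloyd's k-means; with C > 1 the 2nd-closest, 3rd-closest, etc., clusters also contribute. This is a minimal illustrative sketch, not the paper's own notation: the function name, the fixed isotropic variance `sigma2`, and the implicit uniform mixing weights are assumptions made here for brevity.

```python
import numpy as np

def truncated_em_gmm(X, K, C=1, sigma2=1.0, n_iter=50, mu_init=None, seed=0):
    """Illustrative truncated variational EM for an isotropic GMM.

    C=1 recovers Lloyd's k-means (hard assignments to the closest mean);
    C>1 keeps soft truncated posteriors over the C closest clusters.
    Assumes equal mixing weights and a fixed isotropic variance sigma2.
    """
    rng = np.random.default_rng(seed)
    if mu_init is not None:
        mu = np.array(mu_init, dtype=float)
    else:
        mu = X[rng.choice(len(X), K, replace=False)].astype(float)

    for _ in range(n_iter):
        # Squared distances of all N points to all K means: shape (N, K).
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(axis=-1)

        # Truncation: per point, keep only the C closest clusters.
        keep = np.argsort(d2, axis=1)[:, :C]              # (N, C)
        rows = np.arange(len(X))[:, None]
        logits = -d2[rows, keep] / (2.0 * sigma2)
        logits -= logits.max(axis=1, keepdims=True)       # stabilize softmax
        w = np.exp(logits)

        # Truncated posteriors: normalized over the kept clusters only.
        r = np.zeros_like(d2)
        r[rows, keep] = w / w.sum(axis=1, keepdims=True)

        # M-step: means become responsibility-weighted data averages.
        denom = r.sum(axis=0)
        nonempty = denom > 0
        mu[nonempty] = (r.T @ X)[nonempty] / denom[nonempty, None]

    return mu, r
```

At C = 1 the softmax runs over a single element, so each row of `r` is exactly one-hot and the loop performs the standard k-means assignment/update steps; increasing C yields the generalizations the abstract refers to in point (B).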
Pages: 349-356 (8 pages)
Related papers
50 records in total
  • [21] Statistical convergence of the EM algorithm on Gaussian mixture models
    Zhao, Ruofei
    Li, Yuanzhi
    Sun, Yuekai
    [J]. ELECTRONIC JOURNAL OF STATISTICS, 2020, 14 (01): : 632 - 660
  • [22] MAP approximation to the variational Bayes Gaussian mixture model and application
    Kart-Leong Lim
    Han Wang
    [J]. Soft Computing, 2018, 22 : 3287 - 3299
  • [23] MAP approximation to the variational Bayes Gaussian mixture model and application
    Lim, Kart-Leong
    Wang, Han
    [J]. SOFT COMPUTING, 2018, 22 (10) : 3287 - 3299
  • [24] Predictive K-means with Local Models
    Lemaire, Vincent
    Ismaili, Oumaima Alaoui
    Cornuejols, Antoine
    Gay, Dominique
    [J]. TRENDS AND APPLICATIONS IN KNOWLEDGE DISCOVERY AND DATA MINING, 2020, 12237 : 91 - 103
  • [25] Comparative Study of K-means, Gaussian Mixture Model, Fuzzy C-means algorithms for Brain Tumor Segmentation
    Baid, U.
    Talbar, S.
    Talbar, S.
    [J]. PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON COMMUNICATION AND SIGNAL PROCESSING 2016 (ICCASP 2016), 2017, 137 : 592 - 597
  • [26] A Unified Formulation of k-Means, Fuzzy c-Means and Gaussian Mixture Model by the Kolmogorov-Nagumo Average
    Komori, Osamu
    Eguchi, Shinto
    [J]. ENTROPY, 2021, 23 (05)
  • [27] K-means for shared frailty models
    Usha Govindarajulu
    Sandeep Bedi
    [J]. BMC Medical Research Methodology, 22
  • [28] K-means for shared frailty models
    Govindarajulu, Usha
    Bedi, Sandeep
    [J]. BMC MEDICAL RESEARCH METHODOLOGY, 2022, 22 (01)
  • [29] Faster Mahalanobis K-Means Clustering for Gaussian Distributions
    Chokniwal, Ankita
    Singh, Manoj
    [J]. 2016 INTERNATIONAL CONFERENCE ON ADVANCES IN COMPUTING, COMMUNICATIONS AND INFORMATICS (ICACCI), 2016, : 947 - 952
  • [30] Variational Inference of Finite Asymmetric Gaussian Mixture Models
    Song, Ziyang
    Bregu, Ornela
    Ali, Samr
    Bouguila, Nizar
    [J]. 2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019), 2019, : 2448 - 2454