Genetic-based EM algorithm for learning Gaussian mixture models

Cited by: 183
Authors
Pernkopf, F
Bouchaffra, D
Affiliations
[1] Univ Washington, Dept Elect Engn, Seattle, WA 98195 USA
[2] Graz Univ Technol, Lab Signal Proc & Speech Commun, A-8010 Graz, Austria
[3] Oakland Univ, Dept Comp Sci & Engn, Rochester, MI 48309 USA
Keywords
unsupervised learning; clustering; Gaussian mixture models; EM algorithm; genetic algorithm; minimum description length;
DOI
10.1109/TPAMI.2005.162
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We propose a genetic-based expectation-maximization (GA-EM) algorithm for learning Gaussian mixture models from multivariate data. The algorithm selects the number of mixture components using the minimum description length (MDL) criterion. Our approach combines the strengths of genetic algorithms (GA) and the EM algorithm in a single procedure: the population-based stochastic search of the GA explores the search space more thoroughly than EM alone, so the algorithm is less sensitive to its initialization and can escape locally optimal solutions. The GA-EM algorithm is elitist, which preserves the monotonic convergence property of the EM algorithm. Experiments on simulated and real data show that GA-EM outperforms EM in two respects: 1) it achieves a better MDL score under exactly the same termination condition, and 2) it identifies the number of components used to generate the underlying data more often than the EM algorithm does.
Pages: 1344 - 1348
Page count: 5
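The abstract describes the general GA-EM recipe: maintain a population of candidate mixtures, refine each with a few EM iterations, score each with MDL, and use elitist selection so the best model found is never lost. The sketch below illustrates that recipe in Python under simplifying assumptions; it is not the authors' exact method (the paper encodes each individual as a set of candidate components plus a binary activation string and recombines component parameters), and the names used here (run_ga_em, mdl_score, the k_max and pop_size defaults) are illustrative. Only the number of components is evolved, with scikit-learn's GaussianMixture handling the EM steps.

```python
"""Minimal sketch of a GA+EM loop for Gaussian mixture model selection.

NOT the exact GA-EM of Pernkopf & Bouchaffra; a simplified illustration of
the idea: a population of candidate mixtures, each refined by a few EM
iterations, scored by MDL, with elitism preserving the best model seen.
"""
import numpy as np
from sklearn.mixture import GaussianMixture


def mdl_score(gm, X):
    """MDL = -log-likelihood + 0.5 * (#free parameters) * log(N)."""
    n, d = X.shape
    k = gm.n_components
    # Free parameters: weights (k-1) + means (k*d) + full covariances (k*d*(d+1)/2).
    n_params = (k - 1) + k * d + k * d * (d + 1) // 2
    log_lik = gm.score(X) * n  # score() returns the mean per-sample log-likelihood
    return -log_lik + 0.5 * n_params * np.log(n)


def run_ga_em(X, k_max=10, pop_size=8, generations=20, em_steps=5, rng=None):
    rng = np.random.default_rng(rng)
    # Each individual is just a candidate number of components in this sketch.
    population = rng.integers(1, k_max + 1, size=pop_size)
    best_gm, best_score = None, np.inf
    for _ in range(generations):
        scored = []
        for k in population:
            # A few EM iterations per individual, from a random initialization.
            gm = GaussianMixture(n_components=int(k), max_iter=em_steps,
                                 init_params="random",
                                 random_state=int(rng.integers(1 << 31)))
            gm.fit(X)
            scored.append((mdl_score(gm, X), int(k), gm))
        scored.sort(key=lambda t: t[0])
        if scored[0][0] < best_score:  # elitism: never discard the best model seen
            best_score, _, best_gm = scored[0]
        # Selection: keep the better half, then mutate K by +/-1 to refill the population.
        survivors = [k for _, k, _ in scored[: pop_size // 2]]
        children = [int(np.clip(k + rng.choice([-1, 1]), 1, k_max)) for k in survivors]
        population = np.array(survivors + children)
    return best_gm, best_score


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data drawn from three well-separated Gaussians.
    X = np.vstack([rng.normal(loc=c, scale=0.5, size=(200, 2)) for c in (-4, 0, 4)])
    gm, score = run_ga_em(X, rng=1)
    print("selected components:", gm.n_components, "MDL:", round(score, 1))
```

The MDL score used here, -log L + (N_p / 2) log N, equals half the usual BIC; the paper's penalty may differ in constant factors, but the ordering of candidate models is the same.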
Related Papers
50 records in total
  • [1] Genetic-based EM algorithm to improve the robustness of Gaussian mixture models for damage detection in bridges
    Santos, Adam
    Figueiredo, Eloi
    Silva, Moises
    Santos, Reginaldo
    Sales, Claudomiro
    Costa, Joao C. W. A.
    [J]. STRUCTURAL CONTROL & HEALTH MONITORING, 2017, 24 (03):
  • [2] On Semi-Supervised Learning Genetic-Based and Deterministic Annealing EM Algorithm for Dirichlet Mixture Models
    Bai JingHua
    Li Kan
    Zhang XiaoXian
    [J]. QUANTUM, NANO, MICRO AND INFORMATION TECHNOLOGIES, 2011, 39 : 151 - +
  • [3] An EM Algorithm for Singular Gaussian Mixture Models
    Masmoudi, Khalil
    Masmoudi, Afif
    [J]. FILOMAT, 2019, 33 (15) : 4753 - 4767
  • [4] Learning mixture models using a genetic version of the EM algorithm
    Martínez, AM
    Vitrià, J
    [J]. PATTERN RECOGNITION LETTERS, 2000, 21 (08) : 759 - 769
  • [5] Using a Genetic Algorithm for Selection of Starting Conditions for the EM Algorithm for Gaussian Mixture Models
    Kwedlo, Wojciech
    [J]. PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON COMPUTER RECOGNITION SYSTEMS, CORES 2015, 2016, 403 : 125 - 134
  • [6] A greedy EM algorithm for Gaussian mixture learning
    Vlassis, N
    Likas, A
    [J]. NEURAL PROCESSING LETTERS, 2002, 15 (01) : 77 - 87
  • [7] EBEM: An entropy-based EM algorithm for Gaussian mixture models
    Benavent, Antonio Penalver
    Ruiz, Francisco Escolano
    Saez Martinez, Juan M.
    [J]. 18TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOL 2, PROCEEDINGS, 2006: 451 - +
  • [8] A robust EM clustering algorithm for Gaussian mixture models
    Yang, Miin-Shen
    Lai, Chien-Yo
    Lin, Chih-Ying
    [J]. PATTERN RECOGNITION, 2012, 45 (11) : 3950 - 3961
  • [9] Random swap EM algorithm for Gaussian mixture models
    Zhao, Qinpei
    Hautamaki, Ville
    Karkkainen, Ismo
    Franti, Pasi
    [J]. PATTERN RECOGNITION LETTERS, 2012, 33 (16) : 2120 - 2126