A fully Bayesian model based on reversible jump MCMC and finite Beta mixtures for clustering

Cited by: 32
Authors
Bouguila, Nizar [1 ]
Elguebaly, Tarek [1 ]
Affiliations
[1] Concordia Univ, Fac Engn & Comp Sci, Concordia Inst Informat Syst Engn, Montreal, PQ H3G 2W1, Canada
Keywords
Beta distribution; Mixture modeling; Bayesian analysis; MCMC; Reversible jump; Gibbs sampling; Metropolis-Hastings; Texture classification; Retrieval; GENERALIZED GAUSSIAN DENSITY; UNKNOWN NUMBER; TEXTURE CLASSIFICATION; MAXIMUM-LIKELIHOOD; EM ALGORITHM; SELECTION; COMPONENTS; RETRIEVAL; FEATURES;
DOI
10.1016/j.eswa.2011.11.122
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Mixture models have proved to be of considerable interest in image and signal processing, both for their theoretical development and for their usefulness in many applications. In recent years, researchers have tackled the mixture estimation and selection problem for modeling complex datasets with a variety of techniques. In theory, it is well known that fully Bayesian approaches to this problem are optimal: Bayesian learning incorporates prior knowledge in a formal, coherent way that avoids overfitting. In this paper, we propose a fully Bayesian approach for learning finite Beta mixtures using a reversible jump Markov chain Monte Carlo (RJMCMC) technique that simultaneously handles cluster assignment, parameter estimation, and selection of the optimal number of clusters. The adverb "fully" is justified by the fact that all quantities of interest in our model, including the number of clusters and the missing values, are treated as random variables for which priors are specified and posteriors are approximated using RJMCMC. Our work is motivated by the fact that Beta mixtures can fit essentially any unknown distributional shape and therefore constitute a flexible class of models for problems and applications whose measurements and features deviate markedly from the Gaussian shape. The usefulness of the proposed approach is confirmed on synthetic mixture data, on real data, and through an application to texture classification and retrieval. (C) 2011 Elsevier Ltd. All rights reserved.
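To give a concrete feel for the kind of sampler the abstract describes, the following minimal Python sketch implements a Metropolis-within-Gibbs sampler for a finite Beta mixture with the number of components K held fixed. It is not the authors' implementation: the priors (a symmetric Dirichlet(1) on the mixing weights and Gamma(1, 1) on the Beta shape parameters), the proposal scale, and all names are illustrative assumptions, and the paper's full RJMCMC additionally proposes trans-dimensional moves (e.g. birth/death or split/merge) so that K itself is sampled.

import numpy as np
from scipy.stats import beta, dirichlet, gamma
from scipy.special import logsumexp

rng = np.random.default_rng(0)

def log_lik(x, w, a, b):
    # Log-likelihood of data x under the K-component Beta mixture (w, a, b).
    return float(logsumexp(np.log(w) + beta.logpdf(x[:, None], a, b), axis=1).sum())

def gibbs_step(x, w, a, b, mh_scale=0.2):
    n, K = len(x), len(w)
    # 1. Sample the latent allocations z_i given the current parameters.
    logp = np.log(w) + beta.logpdf(x[:, None], a, b)           # shape (n, K)
    p = np.exp(logp - logp.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    z = np.array([rng.choice(K, p=p[i]) for i in range(n)])
    # 2. Sample the mixing weights from their Dirichlet full conditional
    #    (symmetric Dirichlet(1) prior assumed).
    counts = np.bincount(z, minlength=K)
    w = dirichlet.rvs(1.0 + counts, random_state=rng)[0]
    # 3. Random-walk Metropolis-Hastings on each component's Beta shape
    #    parameters on the log scale (Gamma(1, 1) priors assumed).
    for k in range(K):
        xk = x[z == k]
        def log_target(ak, bk):
            return (beta.logpdf(xk, ak, bk).sum()
                    + gamma.logpdf(ak, 1.0) + gamma.logpdf(bk, 1.0)
                    + np.log(ak) + np.log(bk))                  # Jacobian of the log-scale walk
        a_new = a[k] * np.exp(mh_scale * rng.standard_normal())
        b_new = b[k] * np.exp(mh_scale * rng.standard_normal())
        if np.log(rng.uniform()) < log_target(a_new, b_new) - log_target(a[k], b[k]):
            a[k], b[k] = a_new, b_new
    return w, a, b

# Toy run on synthetic data drawn from a two-component Beta mixture.
x = np.concatenate([beta.rvs(2, 8, size=150, random_state=rng),
                    beta.rvs(9, 2, size=150, random_state=rng)])
K = 2
w, a, b = np.full(K, 1.0 / K), np.ones(K), np.ones(K)
for _ in range(500):
    w, a, b = gibbs_step(x, w, a, b)
print("weights:", np.round(w, 2), "shapes a:", np.round(a, 2), "shapes b:", np.round(b, 2))
print("log-likelihood:", round(log_lik(x, w, a, b), 1))

In the full reversible-jump scheme, each iteration would also propose a change of dimension, accepted with a ratio that includes the Jacobian of the dimension-matching transformation; those moves are omitted here for brevity.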
Pages: 5946-5959
Page count: 14
Related Papers
50 items in total
  • [31] Development of a Fully Reversible PAH Clustering Model
    Khabazipur, Arash
    Eaves, Nickolas
    [J]. PROCEEDINGS OF THE COMBUSTION INSTITUTE, 2023, 39 (01) : 919 - 927
  • [32] Reversible jump MCMC for multi-model inference in Metabolic Flux Analysis
    Theorell, Axel
    Noeh, Katharina
    [J]. BIOINFORMATICS, 2020, 36 (01) : 232 - 240
  • [33] Model-based clustering based on sparse finite Gaussian mixtures
    Malsiner-Walli, Gertraud
    Frühwirth-Schnatter, Sylvia
    Grün, Bettina
    [J]. STATISTICS AND COMPUTING, 2016, 26 (1-2) : 303 - 324
  • [35] Frequency estimation and synchronization of frequency hopping signals based on reversible jump MCMC
    Liang, JL
    Gao, L
    Yang, SY
    [J]. ISPACS 2005: PROCEEDINGS OF THE 2005 INTERNATIONAL SYMPOSIUM ON INTELLIGENT SIGNAL PROCESSING AND COMMUNICATION SYSTEMS, 2005, : 589 - 592
  • [36] Reversible jump MCMC for inference in a deterministic individual-based model of tree growth for studying forest dynamics
    Gemoets, D.
    Barber, Jarrett
    Ogle, Kiona
    [J]. ENVIRONMETRICS, 2013, 24 (07) : 433 - 448
  • [37] A Bayesian approach for clustering and exact finite-sample model selection in longitudinal data mixtures
    Corneli, M.
    Erosheva, E.
    Qian, X.
    Lorenzi, M.
    [J]. COMPUTATIONAL STATISTICS, 2024,
  • [39] Bayesian model learning based on a parallel MCMC strategy
    Corander, Jukka
    Gyllenberg, Mats
    Koski, Timo
    [J]. STATISTICS AND COMPUTING, 2006, 16 (04) : 355 - 362
  • [40] A Bayesian analysis of finite mixtures in the LISREL model
    Hong-Tu Zhu
    Sik-Yum Lee
    [J]. Psychometrika, 2001, 66 : 133 - 152