A soft computing model based on asymmetric Gaussian mixtures and Bayesian inference

Cited: 0
Authors
Shuai Fu
Nizar Bouguila
Affiliation
[1] Concordia University, Concordia Institute for Information Systems Engineering
Source
Soft Computing | 2020, Vol. 24
Keywords
Asymmetric Gaussian mixture; RJMCMC; Intrusion detection; Spam filtering; Image categorization; Dimensionality reduction
DOI
Not available
Abstract
A novel unsupervised Bayesian learning framework based on the asymmetric Gaussian mixture (AGM) statistical model is proposed, since the AGM has been shown to be more effective than the classic Gaussian mixture model. The Bayesian learning framework is developed by adopting a sampling-based Markov chain Monte Carlo (MCMC) methodology. More precisely, the core learning algorithm is a hybrid Metropolis–Hastings within Gibbs sampler embedded in a reversible jump MCMC (RJMCMC) framework, a self-adapting sampling-based implementation that allows moves between models of different sizes during the learning of the mixture parameters and therefore converges automatically to the optimal number of data groups. Furthermore, to handle high-dimensional feature vectors, a dimensionality reduction algorithm based on mixtures of distributions is included to discard irrelevant and extraneous features. The performance of the AGM is compared with that of other popular models on both synthetic and real datasets drawn from challenging applications such as intrusion detection, spam filtering and image categorization, demonstrating the merits of the proposed approach.
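For context, the following is a sketch of the asymmetric Gaussian density as it is commonly defined, with distinct left and right standard deviations; the abstract does not spell out the exact parameterization used in the paper, so the notation below (mean \(\mu\), left/right deviations \(\sigma_l, \sigma_r\), mixing weights \(\pi_k\), \(K\) components, \(D\) independent dimensions) is assumed here purely for illustration:

\[
p(x \mid \mu, \sigma_l, \sigma_r) \;=\; \sqrt{\frac{2}{\pi}}\,\frac{1}{\sigma_l + \sigma_r}
\begin{cases}
\exp\!\left(-\dfrac{(x-\mu)^2}{2\sigma_l^2}\right), & x < \mu,\\[1ex]
\exp\!\left(-\dfrac{(x-\mu)^2}{2\sigma_r^2}\right), & x \ge \mu,
\end{cases}
\]

and the corresponding mixture over a \(D\)-dimensional observation, assuming independent dimensions, is

\[
p(\mathbf{x} \mid \Theta) \;=\; \sum_{k=1}^{K} \pi_k \prod_{d=1}^{D} p\!\left(x_d \mid \mu_{kd}, \sigma_{l,kd}, \sigma_{r,kd}\right),
\qquad \sum_{k=1}^{K} \pi_k = 1 .
\]

In the RJMCMC setting described above, \(K\) itself is sampled (via split/merge and birth/death moves) rather than fixed in advance.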
Pages: 4841–4853
Number of pages: 12