Variational Inference of Finite Asymmetric Gaussian Mixture Models

Cited by: 0
Authors
Song, Ziyang [1 ]
Bregu, Ornela [1 ]
Ali, Samr [2 ]
Bouguila, Nizar [1 ]
Affiliations
[1] Concordia Univ, Concordia Inst Informat Syst Engn CIISE, Montreal, PQ, Canada
[2] Concordia Univ, Dept Elect & Comp Engn ECE, Montreal, PQ, Canada
Funding
Natural Sciences and Engineering Research Council of Canada (NSERC)
Keywords
Mixture Model; Variational Bayes Inference; Asymmetric Gaussian Distribution; Background Subtraction; Dirichlet Mixture; Distributions; Selection
DOI: Not available
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract
Mixture models are a popular unsupervised learning technique for discovering homogeneous clusters in unlabeled data. A key research problem lies in the accurate and efficient estimation of their parameters. Variational inference has recently emerged as a prominent approach to this parameter learning task. Hence, in this work, we propose a variational Bayes learning framework for the asymmetric Gaussian mixture model. Unlike the Gaussian mixture model, this model captures the asymmetric shape of the data and adapts to different conditions in real-world image processing domains. Experimental results show the merit of the proposed approach.
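The variational Bayes idea outlined in the abstract can be illustrated with a standard, symmetric Gaussian mixture. The snippet below is a minimal Python sketch using scikit-learn's BayesianGaussianMixture, which implements variational inference for ordinary Gaussian mixtures, not the asymmetric Gaussian mixture proposed in this paper; the synthetic data, the component upper bound of 5, and the prior settings are illustrative assumptions only.

    import numpy as np
    from sklearn.mixture import BayesianGaussianMixture

    # Illustrative 1-D data: one symmetric cluster and one skewed
    # (gamma-distributed) cluster mimicking asymmetric data.
    rng = np.random.default_rng(0)
    symmetric_cluster = rng.normal(loc=-4.0, scale=1.0, size=(500, 1))
    skewed_cluster = rng.gamma(shape=2.0, scale=1.5, size=(500, 1)) + 2.0
    X = np.vstack([symmetric_cluster, skewed_cluster])

    # Variational Bayes fit of a symmetric Gaussian mixture.
    # n_components is only an upper bound: the Dirichlet prior on the
    # mixing weights lets the inference shrink unneeded components to
    # near-zero weight, which is how the effective number of clusters
    # is selected.
    vb_gmm = BayesianGaussianMixture(
        n_components=5,
        weight_concentration_prior_type="dirichlet_distribution",
        covariance_type="full",
        max_iter=500,
        random_state=0,
    )
    labels = vb_gmm.fit_predict(X)
    print("Estimated mixing weights:", np.round(vb_gmm.weights_, 3))

On skewed data such as the gamma cluster above, a symmetric Gaussian mixture typically spends extra components to cover the heavy tail; modeling each component with an asymmetric Gaussian, as in this paper, is intended to avoid that.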
Pages: 2448-2454
Page count: 7
Related Papers (50 records in total)
  • [1] Variational Inference of Finite Generalized Gaussian Mixture Models
    Amudala, Srikanth
    Ali, Samr
    Najar, Fatma
    Bouguila, Nizar
    [J]. 2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019), 2019, : 2433 - 2439
  • [2] Trust-Region Variational Inference with Gaussian Mixture Models
    Arenz, Oleg
    Zhong, Mingjun
    Neumann, Gerhard
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2020, 21
  • [3] Variational Inference of Infinite Generalized Gaussian Mixture Models with Feature Selection
    Amudala, Srikanth
    Ali, Samr
    Bouguila, Nizar
    [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2020, : 120 - 127
  • [4] Variational inference and sparsity in high-dimensional deep Gaussian mixture models
    Kock, Lucas
    Klein, Nadja
    Nott, David J.
    [J]. STATISTICS AND COMPUTING, 2022, 32 (05)
  • [5] Variational learning for Gaussian mixture models
    Nasios, Nikolaos
    Bors, Adrian G.
    [J]. IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 2006, 36 (04): : 849 - 862
  • [6] Correction to: Variational inference and sparsity in high-dimensional deep Gaussian mixture models
    Kock, Lucas
    Klein, Nadja
    Nott, David J.
    [J]. STATISTICS AND COMPUTING, 2023, 33
  • [7] Online variational inference on finite multivariate Beta mixture models for medical applications
    Manouchehri, Narges
    Kalra, Meeta
    Bouguila, Nizar
    [J]. IET IMAGE PROCESSING, 2021, 15 (09) : 1869 - 1882
  • [8] Variational inference with Gaussian mixture model and householder flow
    Liu, GuoJun
    Liu, Yang
    Guo, MaoZu
    Li, Peng
    Li, MingYu
    [J]. NEURAL NETWORKS, 2019, 109 : 43 - 55