Bayesian estimation of Dirichlet mixture model with variational inference

Cited by: 89
Authors
Ma, Zhanyu [1 ]
Rana, Pravin Kumar [2 ]
Taghia, Jalil [2 ]
Flierl, Markus [2 ]
Leijon, Arne [2 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Pattern Recognit & Intelligent Syst Lab, Beijing 100876, Peoples R China
[2] KTH Royal Inst Technol, Sch Elect & Engn, SE-10044 Stockholm, Sweden
Keywords
Bayesian estimation; Variational inference; Extended factorized approximation; Relative convexity; Dirichlet distribution; Gamma prior; Mixture modeling; LSF quantization; Multiview depth image enhancement; VECTOR QUANTIZATION; IMAGE; COMPRESSION; ALGORITHMS;
DOI
10.1016/j.patcog.2014.04.002
Chinese Library Classification
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In statistical modeling, parameter estimation is an essential and challenging task. Estimating the parameters of the Dirichlet mixture model (DMM) is analytically intractable because of the integral expressions involving the gamma function and its derivatives. We introduce a Bayesian strategy to estimate the posterior distribution of the DMM parameters. Placing a gamma prior on each parameter, we approximate both the prior and the posterior distributions of the parameters with a product of mutually independent gamma distributions. The extended factorized approximation method is applied to introduce a single lower bound to the variational objective function, from which an analytically tractable estimation solution is derived. Moreover, only one function is maximized during the iterations, so convergence of the proposed algorithm is theoretically guaranteed. On synthesized data, the proposed method shows advantages over the EM-based method and a previously proposed Bayesian estimation method. Its good performance is further demonstrated on two important multimedia signal processing applications. (C) 2014 Elsevier Ltd. All rights reserved.
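The abstract notes that DMM parameter estimation has no closed form, which is why iterative EM- or variational-style schemes are used. As a toy illustration only (not the paper's extended factorized variational method), the sketch below fits a two-component Dirichlet mixture with EM-style responsibilities and a weighted moment-matching M-step; the mixture parameters, initialization, and the moment-matching update are all assumptions made for this example.

```python
import numpy as np
from scipy.stats import dirichlet

rng = np.random.default_rng(0)

# Synthesize data from a two-component Dirichlet mixture
# (hypothetical parameters, for illustration only).
alphas_true = [np.array([5.0, 2.0, 2.0]), np.array([2.0, 2.0, 8.0])]
X = np.vstack([dirichlet.rvs(a, size=300, random_state=rng) for a in alphas_true])

def fit_dirichlet_moments(X, w):
    """Weighted moment-matching estimate of a Dirichlet's parameters."""
    w = w / w.sum()
    m = w @ X                      # weighted per-coordinate mean
    v = w @ (X - m) ** 2           # weighted per-coordinate variance
    # Var[x_1] = m_1 (1 - m_1) / (s + 1) yields the concentration s.
    s = m[0] * (1.0 - m[0]) / v[0] - 1.0
    return s * m

# A few EM-style iterations: E-step responsibilities, M-step moment matching.
alphas = [np.ones(3), np.array([1.0, 1.0, 3.0])]   # crude initialization
pi = np.array([0.5, 0.5])
for _ in range(50):
    logp = np.stack([dirichlet.logpdf(X.T, a) for a in alphas], axis=1)
    logp += np.log(pi)
    r = np.exp(logp - logp.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)              # responsibilities
    pi = r.mean(axis=0)
    alphas = [fit_dirichlet_moments(X, r[:, k]) for k in range(2)]
```

The moment-matching M-step stands in for the intractable maximum-likelihood update that motivates the paper's variational treatment; the paper instead places gamma priors on the Dirichlet parameters and derives tractable gamma posterior updates.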
Pages: 3143-3157
Page count: 15
Related Papers
50 records
  • [41] Variational Bayesian inference for the probabilistic model of power load
    Dong, Zijian
    Wang, Yunpeng
    Zhao, Jing
    [J]. IET GENERATION TRANSMISSION & DISTRIBUTION, 2014, 8 (11) : 1860 - 1868
  • [42] Variational inference with Gaussian mixture model and householder flow
    Liu, GuoJun
    Liu, Yang
    Guo, MaoZu
    Li, Peng
    Li, MingYu
    [J]. NEURAL NETWORKS, 2019, 109 : 43 - 55
  • [43] Variational Inference for Dirichlet Process Mixtures
    Blei, David M.
    Jordan, Michael I.
    [J]. BAYESIAN ANALYSIS, 2006, 1 (01): : 121 - 143
  • [44] Uncertainty propagation using infinite mixture of Gaussian processes and variational Bayesian inference
    Chen, Peng
    Zabaras, Nicholas
    Bilionis, Ilias
    [J]. JOURNAL OF COMPUTATIONAL PHYSICS, 2015, 284 : 291 - 333
  • [45] Evolutionary continuous optimization by distribution estimation with variational Bayesian independent component analyzers mixture model
    Cho, DY
    Zhang, BT
    [J]. PARALLEL PROBLEM SOLVING FROM NATURE - PPSN VIII, 2004, 3242 : 212 - 221
  • [46] A computational approach for full nonparametric Bayesian inference under Dirichlet process mixture models
    Gelfand, AE
    Kottas, A
    [J]. JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2002, 11 (02) : 289 - 305
  • [47] Importance-Weighted Variational Inference Model Estimation for Offline Bayesian Model-Based Reinforcement Learning
    Hishinuma, Toru
    Senda, Kei
    [J]. IEEE ACCESS, 2023, 11 : 145579 - 145590
  • [49] A tutorial on variational Bayesian inference
    Fox, Charles W.
    Roberts, Stephen J.
    [J]. ARTIFICIAL INTELLIGENCE REVIEW, 2012, 38 (02) : 85 - 95
  • [50] Improved iterative joint detection and estimation through variational Bayesian inference
    Sadough, S. M-S
    Modarresi, M.
    [J]. AEU-INTERNATIONAL JOURNAL OF ELECTRONICS AND COMMUNICATIONS, 2012, 66 (05) : 380 - 383