Approximation of Laws of Multinomial Parameters by Mixtures of Dirichlet Distributions with Applications to Bayesian Inference

Cited by: 0
Authors
Eugenio Regazzini
Viatcheslav V. Sazonov
Affiliations
[1] Università di Pavia, Dipartimento di Matematica
[2] IAMI-CNR
[3] Steklov Mathematical Institute
Source
Acta Applicandae Mathematica | 1999 / Vol. 58
Keywords
approximation of priors and posteriors; Dirichlet distributions; elicitation of prior beliefs; Lévy metric; Prokhorov metric; random measures
DOI
Not available
Abstract
Within the framework of Bayesian inference, when observations are exchangeable and take values in a finite space X, a prior P is approximated (in the Prokhorov metric) to any desired precision by explicitly constructed mixtures of Dirichlet distributions. Likewise, the posteriors are approximated, with controlled precision, by the posteriors of these mixtures of Dirichlet distributions. Approximations in the uniform metric for distribution functions are also given. These results are applied to obtain a method for eliciting prior beliefs and to approximate both the predictive distribution (in the variational metric) and the posterior distribution function of $\int \psi \, \mathrm{d}\tilde p$ (in the Lévy metric), when $\tilde p$ is a random probability measure with distribution P.
Pages: 247 – 264
Number of pages: 17