Approximation of Laws of Multinomial Parameters by Mixtures of Dirichlet Distributions with Applications to Bayesian Inference

Cited by: 0
Authors
Eugenio Regazzini
Viatcheslav V. Sazonov
Affiliations
[1] Università di Pavia, Dipartimento di Matematica
[2] IAMI-CNR
[3] Steklov Mathematical Institute
Source
Acta Applicandae Mathematica, 1999, Vol. 58
Keywords
approximation of priors and posteriors; Dirichlet distributions; elicitation of prior beliefs; Lévy metric; Prokhorov metric; random measures
Abstract
Within the framework of Bayesian inference, when observations are exchangeable and take values in a finite space X, a prior P is approximated (in the Prokhorov metric) to any desired precision by explicitly constructed mixtures of Dirichlet distributions. Likewise, the posteriors are approximated, with a stated precision, by the posteriors of these mixtures of Dirichlet distributions. Approximations in the uniform metric for distribution functions are also given. These results are applied to obtain a method for eliciting prior beliefs and to approximate both the predictive distribution (in the variational metric) and the posterior distribution function of $\int \psi \, d\tilde p$ (in the Lévy metric), when $\tilde p$ is a random probability measure with distribution P.
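A key computational convenience behind the abstract's approach is that a finite mixture of Dirichlet priors is conjugate to multinomial sampling: given observed counts, the posterior is again a Dirichlet mixture, with each component's parameter shifted by the counts and its weight reweighted by that component's marginal likelihood. The sketch below illustrates this standard update in Python; the function names and the two-component example are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.special import gammaln

def log_multi_beta(a):
    # log of the multivariate Beta function B(a) = prod_i Gamma(a_i) / Gamma(sum_i a_i)
    return gammaln(a).sum() - gammaln(a.sum())

def posterior_mixture(weights, alphas, counts):
    """Posterior of a Dirichlet-mixture prior under multinomial counts.

    The posterior is again a mixture of Dirichlets: component j's
    parameter becomes alphas[j] + counts, and its weight is multiplied
    by the marginal likelihood B(alphas[j] + counts) / B(alphas[j]).
    """
    counts = np.asarray(counts, dtype=float)
    new_alphas = [a + counts for a in alphas]
    log_w = np.log(weights) + np.array(
        [log_multi_beta(a + counts) - log_multi_beta(a) for a in alphas]
    )
    log_w -= log_w.max()           # stabilise before exponentiating
    new_weights = np.exp(log_w)
    new_weights /= new_weights.sum()
    return new_weights, new_alphas

# Illustrative two-component mixture prior on the 3-simplex
weights = np.array([0.5, 0.5])
alphas = [np.array([1.0, 1.0, 1.0]),   # uniform component
          np.array([8.0, 1.0, 1.0])]   # component favouring the first cell
counts = [9, 1, 0]                     # observed multinomial counts

w_post, a_post = posterior_mixture(weights, alphas, counts)
print(w_post)     # the component favouring the first cell gains weight
print(a_post[0])  # -> [10.  2.  1.]
```

Because the update is closed-form, approximating an arbitrary prior P by such a mixture (as the paper does, in the Prokhorov metric) immediately yields a tractable approximation to the corresponding posterior as well.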
Pages: 247–264