Asymptotically minimax Bayesian predictive densities for multinomial models

Cited by: 5
Authors
Komaki, Fumiyasu [1 ,2 ]
Affiliations
[1] Univ Tokyo, Dept Math Informat, Grad Sch Informat Sci & Technol, Bunkyo Ku, Tokyo 1138656, Japan
[2] RIKEN Brain Sci Inst, Wako, Saitama 3510198, Japan
Keywords
Dirichlet prior; Jeffreys prior; Kullback-Leibler divergence; latent information prior; reference prior; DISTRIBUTIONS; PRIORS;
DOI
10.1214/12-EJS700
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject classification codes
020208 ; 070103 ; 0714 ;
Abstract
One-step-ahead prediction for the multinomial model is considered. The performance of a predictive density is evaluated by the average Kullback-Leibler divergence from the true density to the predictive density. Asymptotic approximations of the risk functions of Bayesian predictive densities based on Dirichlet priors are obtained. It is shown that the Bayesian predictive density based on a specific Dirichlet prior is asymptotically minimax. This asymptotically minimax prior differs from known objective priors such as the Jeffreys prior and the uniform prior.
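To make the setting concrete, here is a minimal sketch (not from the paper) of the two ingredients the abstract mentions: the Bayesian predictive density for a multinomial cell under a symmetric Dirichlet(alpha) prior, which has the closed form (n_i + alpha) / (N + k*alpha), and the Kullback-Leibler divergence used to score it against the true density. The function names, the 3-cell example, and the choice of alpha values are illustrative assumptions; the paper's asymptotically minimax prior is a different, specific Dirichlet prior not reproduced here.

```python
import numpy as np

def dirichlet_predictive(counts, alpha):
    """One-step-ahead predictive probabilities for a multinomial model
    under a symmetric Dirichlet(alpha) prior: (n_i + alpha) / (N + k*alpha).
    (Illustrative helper; names and setup are not from the paper.)"""
    counts = np.asarray(counts, dtype=float)
    return (counts + alpha) / (counts.sum() + alpha * counts.size)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q), skipping zero-probability
    cells of p (their contribution to the sum is zero)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical example: a 3-cell multinomial with N = 10 observations.
true_p = np.array([0.5, 0.3, 0.2])
counts = np.array([5, 3, 2])
for alpha in (0.5, 1.0):  # Jeffreys prior (1/2) and uniform prior (1)
    q = dirichlet_predictive(counts, alpha)
    print(f"alpha={alpha}: predictive={q}, KL={kl_divergence(true_p, q):.6f}")
```

Averaging this KL divergence over the sampling distribution of the counts gives the risk function whose asymptotic approximation the paper studies; the minimax comparison is over such risk functions as the sample size grows.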
Pages: 934-957
Page count: 24