Bayesian prediction of a density function in terms of e-mixture

Times cited: 9
Authors
Yanagimoto, Takemi [2 ]
Ohnishi, Toshio [1 ]
Affiliations
[1] The Institute of Statistical Mathematics, Minato-ku, Tokyo 106-8569, Japan
[2] Chuo University, Department of Industrial and Systems Engineering, Bunkyo-ku, Tokyo 112-8551, Japan
Keywords
Conjugate prior; DIC; Dual structure; Jeffreys' prior; Pythagorean relationship; Plug-in predictor; Exponential families; Conjugate priors; I-projections; Fit; Extensions; Model
DOI
10.1016/j.jspi.2009.02.005
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline codes
020208; 070103; 0714
Abstract
The optimum Bayesian predictor under the e-divergence loss is proposed and discussed. A notable dualistic structure is observed between the proposed predictor and the optimum predictor under the m-divergence loss, the latter of which predominates in the existing literature. An advantage of the proposed optimum predictor is that it is estimative when the sampling density belongs to an exponential family. Potential advantages of the proposed predictor over its dual are discussed, including connections with the shrinkage estimator and with the Bayesian model selection criterion DIC (deviance information criterion). Further, we emphasize the potential usefulness of Jeffreys' prior. (C) 2009 Elsevier B.V. All rights reserved.
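As background for the contrast drawn in the abstract, the following is a minimal sketch of the two predictors in one common information-geometric convention; the notation (sampling density \(f(y \mid \theta)\), posterior \(\pi(\theta \mid x)\), Kullback-Leibler divergence \(D\)) is ours and is not quoted from the paper.

\[
D(p, q) = \int p(y) \log \frac{p(y)}{q(y)} \, dy .
\]
Under the m-divergence loss \(D\bigl(f(\cdot \mid \theta), \hat{p}\bigr)\), the Bayes-optimal predictor is the familiar m-mixture, i.e. the posterior predictive density
\[
\hat{p}_m(y) = \int f(y \mid \theta) \, \pi(\theta \mid x) \, d\theta ,
\]
whereas under the e-divergence loss \(D\bigl(\hat{p}, f(\cdot \mid \theta)\bigr)\) it is the e-mixture, the normalized geometric mean
\[
\hat{p}_e(y) \propto \exp\!\left\{ \int \log f(y \mid \theta) \, \pi(\theta \mid x) \, d\theta \right\}.
\]
If \(f(y \mid \theta) = h(y) \exp\{\theta^{\top} y - \psi(\theta)\}\) is an exponential family in the natural parameter \(\theta\), the exponent above is linear in \(y\), and normalization gives \(\hat{p}_e(y) = f\bigl(y \mid \mathrm{E}[\theta \mid x]\bigr)\); this plug-in form illustrates the "estimative" property referred to in the abstract.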
Pages: 3064-3075
Number of pages: 12
Related papers
50 records in total
  • [1] Hino, Hideitsu; Takano, Ken; Akaho, Shotaro; Murata, Noboru. Non-parametric e-mixture of Density Functions. Neural Information Processing (ICONIP 2016), Part II, 2016, 9948: 3-10.
  • [2] Takano, Ken; Hino, Hideitsu; Akaho, Shotaro; Murata, Noboru. Nonparametric e-Mixture Estimation. Neural Computation, 2016, 28(12): 2687-2725.
  • [3] Eleid, M. F.; Hilf, E. R. Equation of state for hot and dense n, p, e-mixture with zero charge-density. Astronomy & Astrophysics, 1977, 57(1-2): 243-249.
  • [4] Ormoneit, D.; Tresp, V. Improved Gaussian mixture density estimates using Bayesian penalty terms and network averaging. Advances in Neural Information Processing Systems 8: Proceedings of the 1995 Conference, 1996, 8: 542-548.
  • [5] Hjorth, L. U.; Nabney, I. T. Bayesian training of Mixture Density Networks. IJCNN 2000: Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks, Vol. IV, 2000: 455-460.
  • [6] Zhang, Ying; Yang, Kunde; Yang, Qiulong. Probability density function of ocean noise based on a variational Bayesian Gaussian mixture model. Journal of the Acoustical Society of America, 2020, 147(4): 2087-2097.
  • [7] Cadonna, Annalisa; Kottas, Athanasios; Prado, Raquel. Bayesian mixture modeling for spectral density estimation. Statistics & Probability Letters, 2017, 125: 189-195.
  • [8] Yao, Weixin; Lindsay, Bruce G. Bayesian Mixture Labeling by Highest Posterior Density. Journal of the American Statistical Association, 2009, 104(486): 758-767.
  • [9] Oikonomou, K. N. Prediction with the dynamic Bayesian gamma mixture model. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 1997, 27(4): 529-542.
  • [10] Kowiel, Marcin; Gzella, Andrzej K.; Shmueli, Uri. Centre of Symmetry Prediction with the Exact Probability Density Function of |E|. Acta Crystallographica Section A: Foundations and Advances, 2013, 69: S460.