Generalization of Jeffreys divergence-based priors for Bayesian hypothesis testing

Cited by: 30
Authors
Bayarri, M. J. [2 ]
Garcia-Donato, G. [1 ]
Affiliations
[1] Univ Castilla La Mancha, Dept Econ & Finance, Albacete 02071, Spain
[2] Univ Valencia, E-46003 Valencia, Spain
Funding
U.S. National Science Foundation
Keywords
Bayes factors; Information consistency; Intrinsic priors; Irregular models; Kullback-Leibler divergence; Mixture models
DOI
10.1111/j.1467-9868.2008.00667.x
Chinese Library Classification
O21 [Probability theory and mathematical statistics]; C8 [Statistics]
Subject Classification Codes
020208; 070103; 0714
Abstract
We introduce objective proper prior distributions for hypothesis testing and model selection based on measures of divergence between the competing models; we call them divergence-based (DB) priors. DB priors have simple forms and desirable properties like information (finite sample) consistency and are often similar to other existing proposals like intrinsic priors. Moreover, in normal linear model scenarios, they reproduce the Jeffreys-Zellner-Siow priors exactly. Most importantly, in challenging scenarios such as irregular models and mixture models, DB priors are well defined and very reasonable, whereas alternative proposals are not. We derive approximations to the DB priors as well as Markov chain Monte Carlo and asymptotic expressions for the associated Bayes factors.
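The abstract notes that DB priors reproduce the Jeffreys-Zellner-Siow priors in normal linear models. As a minimal illustration of the kind of Bayes factor such a prior yields, the following Python sketch tests a normal-mean point null under a Cauchy(0, 1) prior, the JZS form when the variance is known. The known-variance setup, the simulated data, and all names are illustrative assumptions, not code or results from the paper.

    import numpy as np
    from scipy import stats
    from scipy.integrate import quad

    # Illustrative sketch (not from the paper): Bayes factor for
    # H0: theta = 0 vs H1: theta != 0 from an i.i.d. N(theta, 1) sample.
    # Under H1, theta gets a Cauchy(0, 1) prior -- the JZS form the
    # abstract says DB priors reproduce in normal settings.

    rng = np.random.default_rng(0)   # simulated data, for illustration only
    n = 30
    x = rng.normal(0.5, 1.0, size=n)
    xbar = x.mean()                  # sufficient statistic: xbar ~ N(theta, 1/n)

    def integrand(theta):
        # likelihood of xbar at theta, weighted by the Cauchy(0, 1) prior
        return stats.norm.pdf(xbar, loc=theta, scale=1/np.sqrt(n)) * stats.cauchy.pdf(theta)

    m1, _ = quad(integrand, -np.inf, np.inf)                # marginal likelihood under H1
    m0 = stats.norm.pdf(xbar, loc=0.0, scale=1/np.sqrt(n))  # likelihood under H0

    print(f"BF01 = {m0 / m1:.4f}")   # > 1 favours H0, < 1 favours H1

Working with the sufficient statistic xbar is valid here because the Bayes factor based on xbar equals the one based on the full sample.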
Pages: 981-1003 (23 pages)
Related Papers (50 in total)
  • [1] Objective Bayesian testing for the correlation coefficient under divergence-based priors
    Peng, Bo
    Wang, Min
    [J]. AMERICAN STATISTICIAN, 2021, 75(1): 41-51
  • [2] Objective Bayesian testing on the common mean of several normal distributions under divergence-based priors
    Kang, Sang Gil
    Lee, Woo Dong
    Kim, Yongku
    [J]. COMPUTATIONAL STATISTICS, 2017, 32(1): 71-91
  • [3] Divergence-based estimation and testing with misclassified data
    Landaburu, E.
    Morales, D.
    Pardo, L.
    [J]. STATISTICAL PAPERS, 2005, 46(3): 397-409
  • [4] On divergence measures leading to Jeffreys and other reference priors
    Liu, Ruitao
    Chakrabarti, Arijit
    Samanta, Tapas
    Ghosh, Jayanta K.
    Ghosh, Malay
    [J]. BAYESIAN ANALYSIS, 2014, 9(2): 331-369
  • [5] Noninformative priors, credible sets and Bayesian hypothesis testing for the intraclass model
    Ghosh, M.
    Heo, J.
    [J]. JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2003, 112(1-2): 133-146
  • [6] Divergence-based estimation and testing of statistical models of classification
    Menendez, M.
    Morales, D.
    Pardo, L.
    Vajda, I.
    [J]. JOURNAL OF MULTIVARIATE ANALYSIS, 1995, 54(2): 329-354
  • [7] Divergence-based introgression polarization
    Forsythe, Evan S.
    Sloan, Daniel B.
    Beilstein, Mark A.
    [J]. GENOME BIOLOGY AND EVOLUTION, 2020, 12(4): 463-478
  • [8] Deep divergence-based clustering
    Kampffmeyer, M.
    Lokse, S.
    Bianchi, F. M.
    Livi, L.
    Salberg, A.-B.
    Jenssen, R.
    [J]. 2017 IEEE 27TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING, 2017