On Default Priors for Robust Bayesian Estimation with Divergences

Cited by: 0
Authors:
Nakagawa, Tomoyuki [1 ]
Hashimoto, Shintaro [2 ]
Affiliations:
[1] Tokyo Univ Sci, Dept Informat Sci, Chiba 2788510, Japan
[2] Hiroshima Univ, Dept Math, Hiroshima 7398521, Japan
Keywords:
divergence; moment matching prior; reference prior; robust estimation; inference; model
DOI: 10.3390/e23010029
Chinese Library Classification: O4 [Physics]
Discipline code: 0702
Abstract:
This paper presents objective priors for robust Bayesian estimation against outliers based on divergences. The minimum gamma-divergence estimator is known to perform well under heavy contamination. Robust Bayesian methods using quasi-posterior distributions based on divergences have also been proposed in recent years. In the objective Bayesian framework, selecting default prior distributions for such quasi-posterior distributions is an important problem. In this study, we provide some properties of reference and moment matching priors under the quasi-posterior distribution based on the gamma-divergence. In particular, we show that the proposed priors are approximately robust under conditions on the contamination distribution, without assuming any condition on the contamination ratio. Some simulation studies are also presented.
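As a concrete illustration of the objects the abstract describes, the sketch below fits a normal location model to contaminated data by minimising the empirical gamma-divergence loss, and then draws from the corresponding quasi-posterior (here with a flat prior) using a plain random-walk Metropolis sampler. The simulated data, the choice gamma = 0.5, the fixed MAD scale, and all tuning constants are illustrative assumptions, not the settings used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def gamma_loss(mu, sigma, x, gamma=0.5):
    """Empirical gamma-cross-entropy for a N(mu, sigma^2) model (illustrative)."""
    z = (x - mu) / sigma
    f = np.exp(-0.5 * z**2) / (sigma * np.sqrt(2.0 * np.pi))
    term1 = -np.log(np.mean(f**gamma)) / gamma
    # log of the integral of f^{1+gamma}, which has a closed form for the normal density
    log_int = -0.5 * gamma * np.log(2.0 * np.pi * sigma**2) - 0.5 * np.log(1.0 + gamma)
    return term1 + log_int / (1.0 + gamma)

# 90% inliers from N(0, 1), 10% gross outliers near 10
x = np.concatenate([rng.normal(0.0, 1.0, 90), rng.normal(10.0, 1.0, 10)])
sigma = 1.4826 * np.median(np.abs(x - np.median(x)))  # robust scale via the MAD

# Point estimate: minimise the gamma-loss over a grid of location values
grid = np.linspace(-2.0, 12.0, 2801)
mu_hat = grid[np.argmin([gamma_loss(m, sigma, x) for m in grid])]

# Quasi-posterior pi(mu | x) proportional to pi(mu) * exp(-n * gamma_loss(mu)),
# with a flat prior, sampled by random-walk Metropolis
n = len(x)
mu, chain = np.median(x), []
for _ in range(4000):
    prop = mu + 0.3 * rng.normal()
    log_ratio = -n * (gamma_loss(prop, sigma, x) - gamma_loss(mu, sigma, x))
    if np.log(rng.uniform()) < log_ratio:
        mu = prop
    chain.append(mu)
post_mean = np.mean(chain[1000:])
```

With this setup the sample mean is pulled toward the outliers (roughly 1.0), while both the minimum gamma-divergence point estimate and the quasi-posterior mean stay near the true location 0, which is the robustness behaviour the abstract refers to.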
Pages: 1-19