Merging experts' opinions: A Bayesian hierarchical model with mixture of prior distributions

Cited by: 10
Authors
Rufo, M. J. [1 ]
Perez, C. J. [1 ]
Martin, J. [1 ]
Affiliations
[1] Univ Extremadura, Dept Matemat, Escuela Politecn, Caceres 10071, Spain
Keywords
Bayesian analysis; Conjugate prior distributions; Exponential families; Prior mixtures; Kullback-Leibler divergence; NATURAL EXPONENTIAL-FAMILIES; QUADRATIC VARIANCE FUNCTIONS; CONJUGATE DISTRIBUTION; APPROXIMATING PRIORS; INFORMATION;
DOI
10.1016/j.ejor.2010.04.005
Chinese Library Classification
C93 [Management Science]
Discipline codes
12; 1201; 1202; 120202
Abstract
In this paper, a general approach is proposed to address a full Bayesian analysis for the class of quadratic natural exponential families in the presence of several expert sources of prior information. By expressing the opinion of each expert as a conjugate prior distribution, a mixture model is used by the decision maker to arrive at a consensus of the sources. A hyperprior distribution on the mixing parameters is considered, and a procedure based on the expected Kullback-Leibler divergence is proposed to analytically calculate the hyperparameter values. Next, the experts' prior beliefs are calibrated with respect to the combined posterior belief over the quantity of interest by using expected Kullback-Leibler divergences, which are estimated with a computationally low-cost method. Finally, the proposed approach is straightforward to apply in practice, as shown through an application. (C) 2010 Elsevier B.V. All rights reserved.
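To make the abstract's pipeline concrete, the following sketch illustrates the general idea for one quadratic natural exponential family: a Poisson likelihood with each expert's opinion encoded as a conjugate Gamma prior. The combined prior is a mixture of these Gammas; conjugacy makes the posterior again a Gamma mixture with analytically updated weights, and a simple Monte Carlo estimate of the Kullback-Leibler divergence between each expert's prior and the combined posterior serves as a calibration measure. All numbers (expert hyperparameters, equal mixing weights, the data) are hypothetical, and this is an illustration in the spirit of the paper, not the authors' exact procedure (in particular, the hyperprior-based choice of mixing weights is replaced here by fixed equal weights).

```python
import numpy as np
from scipy.stats import gamma
from scipy.special import gammaln, logsumexp

# Hypothetical setup: Poisson likelihood with rate lambda; expert i's
# opinion is a conjugate Gamma(a_i, rate b_i) prior.
experts = [(2.0, 1.0), (8.0, 2.0), (1.0, 0.5)]   # (a_i, b_i), hypothetical
w_prior = np.array([1/3, 1/3, 1/3])              # equal mixing weights (illustrative)

x = np.array([3, 5, 4, 6, 2])                    # hypothetical Poisson counts
S, n = x.sum(), len(x)

# Log marginal likelihood of the data under each expert's prior
# (Gamma-Poisson conjugacy); the common -sum(log x_j!) term cancels
# when the weights are normalised, so it is omitted.
log_m = np.array([a * np.log(b) - gammaln(a) + gammaln(a + S)
                  - (a + S) * np.log(b + n) for a, b in experts])

# The posterior is again a Gamma mixture: updated weights and components.
log_w_post = np.log(w_prior) + log_m
w_post = np.exp(log_w_post - logsumexp(log_w_post))
post = [(a + S, b + n) for a, b in experts]

def log_mixture_pdf(lam):
    """Log density of the combined (mixture) posterior at lam."""
    comp = np.array([np.log(wk) + gamma.logpdf(lam, ak, scale=1/bk)
                     for wk, (ak, bk) in zip(w_post, post)])
    return logsumexp(comp, axis=0)

# Monte Carlo estimate of KL(expert prior || combined posterior): a
# computationally cheap calibration of each expert against the consensus.
rng = np.random.default_rng(0)
kls = []
for i, (a, b) in enumerate(experts):
    lam = rng.gamma(a, 1/b, size=20000)          # samples from expert i's prior
    kl = np.mean(gamma.logpdf(lam, a, scale=1/b) - log_mixture_pdf(lam))
    kls.append(kl)
    print(f"expert {i}: posterior weight {w_post[i]:.3f}, KL {kl:.2f}")
```

Experts whose priors sit far from the combined posterior receive large KL values (and typically small posterior weights), which is the sense in which the divergences calibrate the expert opinions.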
Pages: 284-289 (6 pages)