Information theoretic approach to Bayesian inference

Cited by: 0
Author: Jewell, J. [1]
Affiliation: [1] Jet Propulsion Lab, Pasadena, CA 91109, USA
DOI: not available
Chinese Library Classification: O4 [Physics]
Discipline code: 0702
Abstract
A problem of central importance in formulating theoretical explanations of an observed process is the construction of a quantitative comparison between observations and examples within the context of a specific theory. From a Bayesian perspective, the ultimate goal is to compute the posterior probability of a theory given the observations. A direct approach to Bayesian inference is ruled out when the likelihood itself is unknown or computationally intractable, as in the case of non-linear processes with stochastic initial conditions. An information theoretic approach to this problem is developed in which a sequence of exponential-family Gibbs densities is used to approximate the likelihood. The sequence of Gibbs densities is constructed by successively matching expectation-value constraints, giving a positive Gibbs Information Gain, or rate of convergence to the limiting density in the Kullback-Leibler distance sense. Evaluating the sequence of Gibbs densities at the observed data yields a sequence of Bayesian posterior densities that are successively more concentrated. It is shown that the successive confidence intervals form a decreasing sequence of subsets which include, and converge to, the true Bayesian confidence intervals. This provides a justifiable way to extend Bayesian inference to problems where the likelihood is unknown, since the resulting "error bars" are simply larger. Furthermore, the Bayes Information Gain, defined as the rate at which the confidence intervals contract to the limiting interval, is shown to be maximized by a "greedy" approach to constructing the Gibbs likelihoods.
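The construction described in the abstract can be sketched numerically: fit a sequence of exponential-family (Gibbs) densities to successively more expectation-value constraints, and observe the Kullback-Leibler distance to the target density shrink at each stage. The discrete grid, the target mixture density, and the polynomial-moment features below are illustrative assumptions for this sketch, not the paper's actual setup:

```python
import numpy as np

# Discrete grid standing in for the state space of the process.
x = np.linspace(-3.0, 3.0, 201)
u = x / 3.0  # features are powers of u, rescaled to [-1, 1] for stability

# A "true" density that we pretend is only accessible through its
# expectation values (a hypothetical stand-in for an intractable
# likelihood): a skewed two-component mixture, normalized on the grid.
p_true = 0.7 * np.exp(-0.5 * (x - 0.8) ** 2) \
       + 0.3 * np.exp(-0.5 * ((x + 1.2) / 0.5) ** 2)
p_true /= p_true.sum()

def fit_gibbs(order, n_steps=50000, lr=0.2):
    """Fit a Gibbs density p(x) ~ exp(sum_i lam_i * u^i) whose first
    `order` moments match those of p_true, by gradient descent on the
    convex maximum-entropy dual (a sketch, not production code)."""
    feats = np.stack([u ** i for i in range(1, order + 1)])  # (order, n)
    target = feats @ p_true                                  # E_true[u^i]
    lam = np.zeros(order)
    for _ in range(n_steps):
        logits = lam @ feats
        logits -= logits.max()            # numerical stability
        p = np.exp(logits)
        p /= p.sum()
        lam -= lr * (feats @ p - target)  # gradient: E_gibbs - E_true
    return p

def kl(p, q):
    """Kullback-Leibler distance KL(p || q) on the grid."""
    m = p > 0
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

# Each additional expectation-value constraint moves the Gibbs density
# closer to the limiting density in the Kullback-Leibler sense
# (a positive Gibbs Information Gain at every stage).
kl_seq = [kl(p_true, fit_gibbs(k)) for k in (1, 2, 4)]
print(kl_seq)
```

In the paper's "greedy" variant, each stage would add whichever candidate constraint yields the largest drop in this KL distance; here the constraints are simply added in moment order.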
Pages: 433-448 (16 pages)