On Bayesian A- and D-Optimal Experimental Designs in Infinite Dimensions
Cited by: 46
Authors:
Alexanderian, Alen [1,2]
Gloor, Philip J. [3]
Ghattas, Omar [1,4,5]
Affiliations:
[1] Univ Texas Austin, Inst Computat Engn & Sci, Austin, TX 78712 USA
[2] North Carolina State Univ, Dept Math, Raleigh, NC 27695 USA
[3] US Naval Acad, Dept Math, Annapolis, MD 21402 USA
[4] Univ Texas Austin, Dept Geol Sci, Austin, TX 78712 USA
[5] Univ Texas Austin, Dept Mech Engn, Austin, TX 78712 USA
Source:
Funding: U.S. National Science Foundation;
Keywords:
Bayesian inference in Hilbert space;
Gaussian measure;
Kullback-Leibler divergence;
Bayesian optimal experimental design;
expected information gain;
Bayes risk;
INVERSE PROBLEMS;
DOI:
10.1214/15-BA969
Chinese Library Classification: O1 [Mathematics];
Subject Classification Codes: 0701; 070101;
Abstract:
We consider Bayesian linear inverse problems in infinite-dimensional separable Hilbert spaces, with a Gaussian prior measure and an additive Gaussian noise model, and provide an extension of the concept of Bayesian D-optimality to the infinite-dimensional case. To this end, we derive the infinite-dimensional version of the expression for the Kullback-Leibler divergence from the posterior measure to the prior measure, which is subsequently used to derive the expression for the expected information gain. We also study the notion of Bayesian A-optimality in the infinite-dimensional setting, and extend the equivalence, well known in the finite-dimensional Gaussian linear case, between the Bayes risk of the MAP estimator and the trace of the posterior covariance to the infinite-dimensional Hilbert space setting.
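For orientation, the following is a hedged sketch of the standard finite-dimensional Gaussian linear analogue that the abstract says is being generalized; the notation (forward operator F, noise covariance \Gamma_{\mathrm{noise}}, Gaussian prior N(m_0, \Gamma_{\mathrm{pr}}) on R^n, data y) is assumed for illustration and is not taken from this record.

% Posterior of the linear Gaussian model y = F m + \eta, \eta ~ N(0, \Gamma_noise), prior m ~ N(m_0, \Gamma_pr):
\[
\Gamma_{\mathrm{post}} = \bigl(F^{*}\Gamma_{\mathrm{noise}}^{-1}F + \Gamma_{\mathrm{pr}}^{-1}\bigr)^{-1},
\qquad
m_{\mathrm{post}} = \Gamma_{\mathrm{post}}\bigl(F^{*}\Gamma_{\mathrm{noise}}^{-1}y + \Gamma_{\mathrm{pr}}^{-1}m_0\bigr).
\]
% Kullback-Leibler divergence from the (Gaussian) posterior to the (Gaussian) prior in dimension n:
\[
D_{\mathrm{KL}}\bigl(\mu_{\mathrm{post}} \,\|\, \mu_{\mathrm{pr}}\bigr)
= \tfrac{1}{2}\Bigl[\operatorname{tr}\bigl(\Gamma_{\mathrm{pr}}^{-1}\Gamma_{\mathrm{post}}\bigr)
+ (m_{\mathrm{post}}-m_0)^{*}\,\Gamma_{\mathrm{pr}}^{-1}(m_{\mathrm{post}}-m_0)
- n + \ln\frac{\det\Gamma_{\mathrm{pr}}}{\det\Gamma_{\mathrm{post}}}\Bigr].
\]
% Averaging over the data marginal gives the expected information gain (the Bayesian D-optimality criterion);
% in this linear Gaussian case it reduces to a log-determinant of the prior-preconditioned data-misfit Hessian:
\[
\Psi = \mathbb{E}_{y}\bigl[D_{\mathrm{KL}}(\mu_{\mathrm{post}} \,\|\, \mu_{\mathrm{pr}})\bigr]
= \tfrac{1}{2}\,\ln\det\bigl(I + \Gamma_{\mathrm{pr}}^{1/2}F^{*}\Gamma_{\mathrm{noise}}^{-1}F\,\Gamma_{\mathrm{pr}}^{1/2}\bigr).
\]
% Bayesian A-optimality instead minimizes \operatorname{tr}(\Gamma_{\mathrm{post}}), which in the Gaussian
% linear case coincides with the Bayes risk of the MAP estimator (here equal to the posterior mean).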
Pages: 671-695
Page count: 25