Model Mixing Using Bayesian Additive Regression Trees

Cited: 0
|
Authors
Yannotty, John C. [1 ]
Santner, Thomas J. [1 ]
Furnstahl, Richard J. [1 ]
Pratola, Matthew T. [1 ]
Affiliations
[1] Ohio State Univ, Columbus, OH 43210 USA
Funding
National Science Foundation (USA);
Keywords
Computer experiments; Effective field theories; Model stacking; Uncertainty quantification; STACKING;
DOI
10.1080/00401706.2023.2257765
CLC Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Discipline Codes
020208 ; 070103 ; 0714 ;
Abstract
In modern computer experiment applications, one often encounters the situation where various models of a physical system are considered, each implemented as a simulator on a computer. An important question in such a setting is determining the best simulator, or the best combination of simulators, to use for prediction and inference. Bayesian model averaging (BMA) and stacking are two statistical approaches used to account for model uncertainty by aggregating a set of predictions through a simple linear combination or weighted average. Bayesian model mixing (BMM) extends these ideas to capture the localized behavior of each simulator by defining input-dependent weights. One possibility is to define the relationship between inputs and the weight functions using a flexible nonparametric model that learns the local strengths and weaknesses of each simulator. This article proposes a BMM model based on Bayesian Additive Regression Trees (BART). The proposed methodology is applied to combine predictions from Effective Field Theories (EFTs) associated with a motivating nuclear physics application. Supplementary materials for this article are available online. Source code is available at https://github.com/jcyannotty/OpenBT.
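To make the idea of input-dependent weights concrete, here is a minimal illustrative sketch, not the paper's BART-based method: two hypothetical "simulators" (standing in for low- and high-fidelity EFT expansions) are mixed pointwise, with a hand-chosen logistic weight function standing in for the nonparametric tree-based weight model that the paper learns from data. All function names and parameter values below are assumptions for illustration only.

```python
import numpy as np

def simulator_low(x):
    # hypothetical simulator, assumed accurate for small x
    return 1.0 + 0.5 * x

def simulator_high(x):
    # hypothetical simulator, assumed accurate for large x
    return np.sin(x) + 2.0

def weight(x, center=2.0, scale=1.0):
    # logistic weight function: near 1 for small x (favor simulator_low),
    # near 0 for large x (favor simulator_high); in BMM this weight
    # function would be learned from data rather than fixed by hand
    return 1.0 / (1.0 + np.exp((x - center) / scale))

def mixed_prediction(x):
    # convex, input-dependent combination of the two simulators
    w = weight(x)
    return w * simulator_low(x) + (1.0 - w) * simulator_high(x)

x_grid = np.linspace(0.0, 4.0, 9)
predictions = mixed_prediction(x_grid)
```

The key contrast with BMA and stacking is visible here: the mixing weight varies with the input `x`, so each simulator dominates in the region where it is assumed to be accurate, rather than receiving a single global weight.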
Pages: 196 - 207
Page count: 12
Related Papers
50 records
  • [1] Bayesian Additive Regression Trees using Bayesian model averaging
    Hernández, Belinda
    Raftery, Adrian E.
    Pennington, Stephen R.
    Parnell, Andrew C.
    [J]. STATISTICS AND COMPUTING, 2018, 28 (04) : 869 - 890
  • [2] Bayesian additive regression trees with model trees
    Prado, Estevão B.
    Moral, Rafael A.
    Parnell, Andrew C.
    [J]. STATISTICS AND COMPUTING, 2021, 31 (03)
  • [3] Bayesian additive regression trees and the General BART model
    Tan, Yaoyuan Vincent
    Roy, Jason
    [J]. STATISTICS IN MEDICINE, 2019, 38 (25) : 5048 - 5069
  • [4] Variable Selection Using Bayesian Additive Regression Trees
    Luo, Chuji
    Daniels, Michael J.
    [J]. STATISTICAL SCIENCE, 2024, 39 (02) : 286 - 304
  • [5] BART: BAYESIAN ADDITIVE REGRESSION TREES
    Chipman, Hugh A.
    George, Edward I.
    McCulloch, Robert E.
    [J]. ANNALS OF APPLIED STATISTICS, 2010, 4 (01) : 266 - 298
  • [6] Parallel Bayesian Additive Regression Trees
    Pratola, Matthew T.
    Chipman, Hugh A.
    Gattiker, James R.
    Higdon, David M.
    McCulloch, Robert
    Rust, William N.
    [J]. JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2014, 23 (03) : 830 - 852
  • [7] Forecasting with many predictors using Bayesian additive regression trees
    Prüser, Jan
    [J]. JOURNAL OF FORECASTING, 2019, 38 (07) : 621 - 631
  • [8] Particle Gibbs for Bayesian Additive Regression Trees
    Lakshminarayanan, Balaji
    Roy, Daniel M.
    Teh, Yee Whye
    [J]. ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 38, 2015, 38 : 553 - 561