Parallelizing MCMC with Random Partition Trees

Cited by: 0
Authors
Wang, Xiangyu [1 ]
Guo, Fangjian [2 ]
Heller, Katherine A. [1 ]
Dunson, David B. [1 ]
Affiliations
[1] Duke Univ, Dept Stat Sci, Durham, NC 27706 USA
[2] Duke Univ, Dept Comp Sci, Durham, NC 27706 USA
Keywords
DOI: Not available
CLC Number: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract
The modern scale of data has brought new challenges to Bayesian inference. In particular, conventional MCMC algorithms are computationally very expensive for large data sets. A promising approach to this problem is embarrassingly parallel MCMC (EP-MCMC), which first partitions the data into multiple subsets and runs independent sampling algorithms on each subset. The subset posterior draws are then aggregated via combining rules to obtain the final approximation. Existing EP-MCMC algorithms are limited by approximation accuracy and difficulty in resampling. In this article, we propose a new EP-MCMC algorithm, PART, that solves these problems. The new algorithm applies random partition trees to combine the subset posterior draws; the resulting aggregation is distribution-free, easy to resample from, and can adapt to multiple scales. We provide theoretical justification and extensive experiments illustrating empirical performance.
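The abstract describes the general EP-MCMC workflow: partition the data, sample each subset posterior independently, then aggregate the subset draws. The sketch below illustrates that split/sample/combine pattern under strong simplifying assumptions: a one-dimensional Gaussian mean with a flat prior, closed-form subset posteriors standing in for actual MCMC runs, and a fixed histogram grid standing in for the paper's random partition trees. Names such as sample_subset_posterior are illustrative only and do not come from the paper.

```python
# Minimal sketch of the split/sample/combine pattern behind EP-MCMC.
# NOT the authors' PART algorithm: the combination step uses a fixed 1-D
# histogram grid instead of random partition trees.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: observations from N(mu, 1) with a flat prior on mu.
data = rng.normal(2.0, 1.0, size=10_000)
subsets = np.array_split(data, 4)  # the "embarrassingly parallel" data partition


def sample_subset_posterior(x, n_draws=5_000):
    """Draws from the subset posterior of mu. With a flat prior and known
    unit variance this is N(mean(x), 1/len(x)) in closed form; a real
    application would run an MCMC sampler on the subset here instead."""
    return rng.normal(x.mean(), np.sqrt(1.0 / len(x)), size=n_draws)


subset_draws = [sample_subset_posterior(x) for x in subsets]

# Combine: estimate each subset posterior density on a shared grid of blocks,
# multiply the block densities across subsets, renormalize, and resample.
edges = np.linspace(min(d.min() for d in subset_draws),
                    max(d.max() for d in subset_draws), 201)
centers = 0.5 * (edges[:-1] + edges[1:])

log_prod = np.zeros(len(centers))
for draws in subset_draws:
    hist, _ = np.histogram(draws, bins=edges, density=True)
    log_prod += np.log(hist + 1e-12)  # small constant avoids log(0) in empty blocks

weights = np.exp(log_prod - log_prod.max())
weights /= weights.sum()

# Resampling from the combined approximation is just sampling blocks by weight.
combined = rng.choice(centers, size=5_000, p=weights)
print("combined posterior mean ~", combined.mean())
```

Because the subset posteriors here are Gaussian with a flat prior, their pointwise product recovers the full-data posterior, so the resampled draws concentrate near the full-data posterior mean; the paper's contribution is making the density-product step accurate and resamplable in higher dimensions via random partition trees.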
Pages: 9