Robust Distributed Bayesian Learning with Stragglers via Consensus Monte Carlo

Citations: 0
Authors
Chittoor, Hari Hara Suthan [1 ]
Simeone, Osvaldo [1 ]
Affiliations
[1] Kings Coll London, Dept Engn, KCLIP Lab, London, England
Funding
European Research Council
Keywords
Distributed Bayesian learning; stragglers; Consensus Monte Carlo; grouping; coded computing;
DOI
10.1109/GLOBECOM48099.2022.10001070
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
This paper studies distributed Bayesian learning in a setting encompassing a central server and multiple workers by focusing on the problem of mitigating the impact of stragglers. The standard one-shot, or embarrassingly parallel, Bayesian learning protocol known as consensus Monte Carlo (CMC) is generalized by proposing two straggler-resilient solutions based on grouping and coding. Two main challenges in designing straggler-resilient algorithms for CMC are the need to estimate the statistics of the workers' outputs across multiple shots, and the joint non-linear post-processing of the outputs of the workers carried out at the server. This is in stark contrast to other distributed settings like gradient coding, which only require the per-shot sum of the workers' outputs. The proposed methods, referred to as Group-based CMC (G-CMC) and Coded CMC (C-CMC), leverage redundant computing at the workers in order to enable the estimation of global posterior samples at the server based on partial outputs from the workers. Simulation results show that C-CMC may outperform G-CMC for a small number of workers, while G-CMC is generally preferable for a larger number of workers.
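For context on the baseline that the paper generalizes: in standard one-shot CMC (Scott et al.), each worker draws samples from a "subposterior" computed on its data shard, and the server fuses them sample-by-sample with weights given by the inverse subposterior covariances. The sketch below illustrates only this standard combination rule on synthetic Gaussian samples; it does not implement the paper's straggler-resilient G-CMC or C-CMC schemes, and all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Each of K workers holds one data shard and produces samples from its
# subposterior p(theta | shard_k). Here those samples are faked as
# Gaussians with slightly different means/scales (illustrative only).
K, n_samples, dim = 4, 1000, 2
worker_samples = [
    rng.normal(loc=0.1 * k, scale=1.0 + 0.1 * k, size=(n_samples, dim))
    for k in range(K)
]

# Standard CMC fusion: weight each worker by the inverse sample
# covariance of its subposterior, then average the i-th samples across
# workers with those matrix weights.
weights = [np.linalg.inv(np.cov(s.T)) for s in worker_samples]
total_inv = np.linalg.inv(sum(weights))
combined = np.stack([
    total_inv @ sum(w @ s[i] for w, s in zip(weights, worker_samples))
    for i in range(n_samples)
])  # shape: (n_samples, dim) approximate global posterior samples
```

This per-sample, covariance-weighted fusion is the "joint non-linear post-processing" the abstract refers to: unlike gradient coding, the server needs the workers' full sample streams (to estimate the weight matrices), not just a per-shot sum, which is what makes straggler mitigation harder here.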
Pages: 609 - 614
Number of pages: 6