50 items in total
- [1] Gradient Coding: Avoiding Stragglers in Distributed Learning [C]. International Conference on Machine Learning (ICML), Vol. 70, 2017.
- [4] Balancing Stragglers Against Staleness in Distributed Deep Learning [C]. 2018 IEEE 25th International Conference on High Performance Computing (HiPC), 2018: 12-21.
- [5] Adaptive Distributed Stochastic Gradient Descent for Minimizing Delay in the Presence of Stragglers [C]. 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2020: 4262-4266.
- [6] Speeding Up Distributed Gradient Descent by Utilizing Non-persistent Stragglers [C]. 2019 IEEE International Symposium on Information Theory (ISIT), 2019: 2729-2733.
- [7] Grouping Synchronous to Eliminate Stragglers with Edge Computing in Distributed Deep Learning [C]. 19th IEEE International Symposium on Parallel and Distributed Processing with Applications (ISPA/BDCloud/SocialCom/SustainCom 2021), 2021: 429-436.
- [8] Robust Distributed Bayesian Learning with Stragglers via Consensus Monte Carlo [C]. 2022 IEEE Global Communications Conference (GLOBECOM), 2022: 609-614.
- [10] Stragglers in Distributed Matrix Multiplication [C]. Job Scheduling Strategies for Parallel Processing (JSSPP 2023), Vol. 14283, 2023: 74-96.