共 50 条
- [12] Anytime Exploitation of Stragglers in Synchronous Stochastic Gradient Descent [J]. 2017 16TH IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA), 2017, : 141 - 146
- [14] Faster Distributed Deep Net Training: Computation and Communication Decoupled Stochastic Gradient Descent [J]. PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 4582 - 4589
- [16] Communication-Efficient Local Stochastic Gradient Descent for Scalable Deep Learning [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2020, : 718 - 727
- [17] Annealed Gradient Descent for Deep Learning [J]. UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2015, : 652 - 661
- [20] Distributed Stochastic Gradient Descent With Compressed and Skipped Communication [J]. IEEE ACCESS, 2023, 11 : 99836 - 99846