50 entries in total
- [1] Leader Stochastic Gradient Descent for Distributed Training of Deep Learning Models [J]. Advances in Neural Information Processing Systems 32 (NeurIPS 2019), 2019, 32.
- [3] Distributed Stochastic Gradient Descent With Compressed and Skipped Communication [J]. IEEE Access, 2023, 11: 99836-99846.
- [5] Distributed Stochastic Gradient Descent with Event-Triggered Communication [J]. Proceedings of the AAAI Conference on Artificial Intelligence, 2020, 34: 7169-7178.
- [6] A DAG Model of Synchronous Stochastic Gradient Descent in Distributed Deep Learning [J]. 2018 IEEE 24th International Conference on Parallel and Distributed Systems (ICPADS 2018), 2018: 425-432.
- [7] Bayesian Distributed Stochastic Gradient Descent [J]. Advances in Neural Information Processing Systems 31 (NeurIPS 2018), 2018, 31.
- [9] An Efficient, Distributed Stochastic Gradient Descent Algorithm for Deep-Learning Applications [J]. 2017 46th International Conference on Parallel Processing (ICPP), 2017: 11-20.
- [10] Communication-Efficient Local Stochastic Gradient Descent for Scalable Deep Learning [J]. 2020 IEEE International Conference on Big Data (Big Data), 2020: 718-727.