50 entries in total
- [1] "An efficient, distributed stochastic gradient descent algorithm for deep-learning applications," in 2017 46th International Conference on Parallel Processing (ICPP), 2017, pp. 11-20.
- [2] "A DAG Model of Synchronous Stochastic Gradient Descent in Distributed Deep Learning," in 2018 IEEE 24th International Conference on Parallel and Distributed Systems (ICPADS 2018), 2018, pp. 425-432.
- [3] "Leader Stochastic Gradient Descent for Distributed Training of Deep Learning Models," in Advances in Neural Information Processing Systems 32 (NIPS 2019), 2019, vol. 32.
- [6] "Adaptive Stochastic Gradient Descent for Deep Learning on Heterogeneous CPU+GPU Architectures," in 2021 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW), 2021, pp. 6-15.
- [7] "Hierarchical Heterogeneous Cluster Systems for Scalable Distributed Deep Learning," in 2024 IEEE 27th International Symposium on Real-Time Distributed Computing (ISORC 2024), 2024.
- [10] "Machine Learning at the Wireless Edge: Distributed Stochastic Gradient Descent Over-the-Air," in 2019 IEEE International Symposium on Information Theory (ISIT), 2019, pp. 1432-1436.