50 records in total
- [41] Distributed Framework for Accelerating Training of Deep Learning Models through Prioritization 2021 IEEE INTERNATIONAL CONFERENCE ON CLOUD ENGINEERING, IC2E 2021, 2021, : 201 - 209
- [42] Efficient Flow Scheduling in Distributed Deep Learning Training with Echelon Formation THE 21ST ACM WORKSHOP ON HOT TOPICS IN NETWORKS, HOTNETS 2022, 2022, : 93 - 100
- [43] Leader Stochastic Gradient Descent for Distributed Training of Deep Learning Models ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
- [45] Exploring the Effects of Silent Data Corruption in Distributed Deep Learning Training 2022 IEEE 34TH INTERNATIONAL SYMPOSIUM ON COMPUTER ARCHITECTURE AND HIGH PERFORMANCE COMPUTING (SBAC-PAD 2022), 2022, : 21 - 30
- [46] BK.Synapse: A scalable distributed training framework for deep learning SOICT 2019: PROCEEDINGS OF THE TENTH INTERNATIONAL SYMPOSIUM ON INFORMATION AND COMMUNICATION TECHNOLOGY, 2019, : 43 - 48
- [47] Deployment Service for Scalable Distributed Deep Learning Training on Multiple Clouds CLOSER: PROCEEDINGS OF THE 11TH INTERNATIONAL CONFERENCE ON CLOUD COMPUTING AND SERVICES SCIENCE, 2021, : 135 - 142
- [49] An Empirical Study of Distributed Deep Learning Training on Edge (Student Abstract) THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 21, 2024, : 23590 - 23591
- [50] Optimizing On-demand GPUs in the Cloud for Deep Learning Applications Training 2019 4TH INTERNATIONAL CONFERENCE ON COMPUTING, COMMUNICATIONS AND SECURITY (ICCCS), 2019