50 entries in total
- [1] DAC-SGD: A Distributed Stochastic Gradient Descent Algorithm Based on Asynchronous Connection [J]. IIP'17: PROCEEDINGS OF THE 2ND INTERNATIONAL CONFERENCE ON INTELLIGENT INFORMATION PROCESSING, 2017
- [2] Developing a Loss Prediction-based Asynchronous Stochastic Gradient Descent Algorithm for Distributed Training of Deep Neural Networks [J]. PROCEEDINGS OF THE 49TH INTERNATIONAL CONFERENCE ON PARALLEL PROCESSING, ICPP 2020, 2020
- [3] Asynchronous Stochastic Gradient Descent for DNN Training [J]. 2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2013, : 6660 - 6663
- [5] Distributed Stochastic Gradient Descent With Compressed and Skipped Communication [J]. IEEE ACCESS, 2023, 11 : 99836 - 99846
- [7] Faster Distributed Deep Net Training: Computation and Communication Decoupled Stochastic Gradient Descent [J]. PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 4582 - 4589
- [8] Distributed Stochastic Gradient Descent with Event-Triggered Communication [J]. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 7169 - 7178
- [9] Hinge Classification Algorithm Based on Asynchronous Gradient Descent [J]. ADVANCES ON BROAD-BAND WIRELESS COMPUTING, COMMUNICATION AND APPLICATIONS, BWCCA-2017, 2018, 12 : 459 - 468