共 50 条
- [1] A Communication-Efficient Distributed Gradient Clipping Algorithm for Training Deep Neural Networks ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
- [3] Training Acceleration for Deep Neural Networks: A Hybrid Parallelization Strategy 2021 58TH ACM/IEEE DESIGN AUTOMATION CONFERENCE (DAC), 2021, : 1165 - 1170
- [4] Privacy-preserving and Communication-efficient Convolutional Neural Network Prediction Framework in Mobile Cloud Computing KSII TRANSACTIONS ON INTERNET AND INFORMATION SYSTEMS, 2021, 15 (12): : 4345 - 4363
- [5] Parallel Deep Convolutional Neural Network Training by Exploiting the Overlapping of Computation and Communication 2017 IEEE 24TH INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPUTING (HIPC), 2017, : 183 - 192
- [6] Communication-Efficient Quantized SGD for Learning Polynomial Neural Network 2021 IEEE INTERNATIONAL PERFORMANCE, COMPUTING, AND COMMUNICATIONS CONFERENCE (IPCCC), 2021,
- [7] A Communication-Efficient Model of Sparse Neural Network for Distributed Intelligence 2016 IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS (INFOCOM WKSHPS), 2016,
- [8] Upgrade of Deep Neural Network-based Optical Monitors by Communication-Efficient Federated Learning 2023 OPTICAL FIBER COMMUNICATIONS CONFERENCE AND EXHIBITION, OFC, 2023,
- [9] Communication-Efficient Distributed Stochastic AUC Maximization with Deep Neural Networks INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
- [10] Communication-Efficient Distributed Stochastic AUC Maximization with Deep Neural Networks 25TH AMERICAS CONFERENCE ON INFORMATION SYSTEMS (AMCIS 2019), 2019,