- [22] Layer-by-Layer Knowledge Distillation for Training Simplified Bipolar Morphological Neural Networks. Programming and Computer Software, 2023, 49: S108-S114
- [25] Knowledge Distillation on Neural Networks for Evolving Graphs. Social Network Analysis and Mining, 2021, 11
- [26] Online Cross-Layer Knowledge Distillation on Graph Neural Networks with Deep Supervision. Neural Computing and Applications, 2023, 35(30): 22359-22374
- [28] Stochastic Precision Ensemble: Self-Knowledge Distillation for Quantized Deep Neural Networks. Proceedings of the AAAI Conference on Artificial Intelligence, 2021, 35: 6794-6802
- [29] TAKDSR: Teacher Assistant Knowledge Distillation Framework for Graphics Image Super-Resolution. IEEE Access, 2023, 11: 112015-112026
- [30] On Mixup Training: Improved Calibration and Predictive Uncertainty for Deep Neural Networks. Advances in Neural Information Processing Systems 32 (NeurIPS 2019), 2019, 32