50 records in total
- [1] Inplace knowledge distillation with teacher assistant for improved training of flexible deep neural networks [J]. 29th European Signal Processing Conference (EUSIPCO 2021), 2021: 1356-1360
- [4] TAKDSR: Teacher Assistant Knowledge Distillation Framework for Graphics Image Super-Resolution [J]. IEEE Access, 2023, 11: 112015-112026
- [7] Data-Efficient Knowledge Distillation with Teacher Assistant-Based Dynamic Objective Alignment [J]. Computational Science, ICCS 2024, Pt I, 2024, 14832: 181-195
- [9] Improving knowledge distillation via pseudo-multi-teacher network [J]. Machine Vision and Applications, 2023, 34