50 records in total
- [31] mKDNAD: A network flow anomaly detection method based on multi-teacher knowledge distillation. 2022 16th IEEE International Conference on Signal Processing (ICSP2022), Vol. 1, 2022: 314-319
- [32] Named Entity Recognition Method Based on Multi-Teacher Collaborative Cyclical Knowledge Distillation. Proceedings of the 2024 27th International Conference on Computer Supported Cooperative Work in Design (CSCWD 2024), 2024: 230-235
- [33] A Multi-teacher Knowledge Distillation Framework for Distantly Supervised Relation Extraction with Flexible Temperature. Web and Big Data, Pt. II (APWeb-WAIM 2023), 2024, 14332: 103-116
- [35] Continual Learning with Confidence-based Multi-teacher Knowledge Distillation for Neural Machine Translation. 2024 6th International Conference on Natural Language Processing (ICNLP 2024), 2024: 336-343
- [37] UNIC: Universal Classification Models via Multi-teacher Distillation. Computer Vision - ECCV 2024, Pt. IV, 2025, 15062: 353-371
- [38] Enhanced Accuracy and Robustness via Multi-teacher Adversarial Distillation. Computer Vision - ECCV 2022, Pt. IV, 2022, 13664: 585-602