共 50 条
- [22] mKDNAD: A network flow anomaly detection method based on multi-teacher knowledge distillation 2022 16TH IEEE INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING (ICSP2022), VOL 1, 2022, : 314 - 319
- [25] Reinforced Multi-teacher Knowledge Distillation for Unsupervised Sentence Representation ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING-ICANN 2024, PT VII, 2024, 15022 : 320 - 332
- [26] Learning Lightweight Object Detectors via Multi-Teacher Progressive Distillation INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 202, 2023, 202
- [27] MTKD: Multi-Teacher Knowledge Distillation for Image Super-Resolution COMPUTER VISION - ECCV 2024, PT XXXIX, 2025, 15097 : 364 - 382
- [28] Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks 2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 6424 - 6432
- [29] Multi-task learning for collaborative filtering International Journal of Machine Learning and Cybernetics, 2022, 13 : 1355 - 1368
- [30] Neural multi-task collaborative filtering EVOLUTIONARY INTELLIGENCE, 2022, 15 (04) : 2385 - 2393