50 records in total
- [31] Enhanced Accuracy and Robustness via Multi-teacher Adversarial Distillation. Computer Vision - ECCV 2022, Pt. IV, 2022, 13664: 585-602
- [36] Learning Lightweight Object Detectors via Multi-Teacher Progressive Distillation. International Conference on Machine Learning, Vol. 202, 2023
- [37] MTUW-GAN: A Multi-Teacher Knowledge Distillation Generative Adversarial Network for Underwater Image Enhancement. Applied Sciences-Basel, 2024, 14 (2)
- [38] Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System. Proceedings of the 13th International Conference on Web Search and Data Mining (WSDM '20), 2020: 690-698
- [39] Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks. 2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2023: 6424-6432
- [40] Data-Free Low-Bit Quantization via Dynamic Multi-teacher Knowledge Distillation. Pattern Recognition and Computer Vision, PRCV 2023, Pt. VIII, 2024, 14432: 28-41