Total: 50 records
- [34] Learning Lightweight Object Detectors via Multi-Teacher Progressive Distillation. International Conference on Machine Learning, Vol. 202, 2023.
- [35] MTUW-GAN: A Multi-Teacher Knowledge Distillation Generative Adversarial Network for Underwater Image Enhancement. Applied Sciences-Basel, 2024, 14(2).
- [36] Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System. Proceedings of the 13th International Conference on Web Search and Data Mining (WSDM '20), 2020: 690-698.
- [37] Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks. 2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2023: 6424-6432.
- [38] Data-Free Low-Bit Quantization via Dynamic Multi-teacher Knowledge Distillation. Pattern Recognition and Computer Vision, PRCV 2023, Pt. VIII, 2024, 14432: 28-41.
- [39] Let All Be Whitened: Multi-Teacher Distillation for Efficient Visual Retrieval. Thirty-Eighth AAAI Conference on Artificial Intelligence, Vol. 38, No. 5, 2024: 4126-4135.