共 50 条
- [42] A General Knowledge Distillation Framework for Counterfactual Recommendation via Uniform Data PROCEEDINGS OF THE 43RD INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '20), 2020, : 831 - 840
- [43] MTMS: Multi-teacher Multi-stage Knowledge Distillation for Reasoning-Based Machine Reading Comprehension PROCEEDINGS OF THE 47TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2024, 2024, : 1995 - 2005
- [45] Learning Lightweight Object Detectors via Multi-Teacher Progressive Distillation INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 202, 2023, 202
- [46] Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System PROCEEDINGS OF THE 13TH INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING (WSDM '20), 2020, : 690 - 698
- [47] MTUW-GAN: A Multi-Teacher Knowledge Distillation Generative Adversarial Network for Underwater Image Enhancement APPLIED SCIENCES-BASEL, 2024, 14 (02):
- [49] Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks 2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 6424 - 6432