50 items in total
- [41] MTMS: Multi-teacher Multi-stage Knowledge Distillation for Reasoning-Based Machine Reading Comprehension [J]. Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2024), 2024: 1995-2005
- [43] Improving Bi-encoder Document Ranking Models with Two Rankers and Multi-teacher Distillation [J]. Proceedings of the 44th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR '21), 2021: 2192-2196
- [45] Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System [J]. Proceedings of the 13th International Conference on Web Search and Data Mining (WSDM '20), 2020: 690-698
- [47] MTUW-GAN: A Multi-Teacher Knowledge Distillation Generative Adversarial Network for Underwater Image Enhancement [J]. Applied Sciences-Basel, 2024, 14 (02)
- [48] Data-Free Low-Bit Quantization via Dynamic Multi-teacher Knowledge Distillation [J]. Pattern Recognition and Computer Vision (PRCV 2023), Pt VIII, 2024, 14432: 28-41
- [49] Unsupervised Domain Adaptation in Medical Image Segmentation via Fourier Feature Decoupling and Multi-teacher Distillation [J]. Advanced Intelligent Computing Technology and Applications (ICIC 2024), Pt VI, 2024, 14867: 98-110
- [50] Semi-supervised lung adenocarcinoma histopathology image classification based on multi-teacher knowledge distillation [J]. Physics in Medicine and Biology, 2024, 69 (18)