50 entries in total
- [32] Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks [J]. 2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023: 6424-6432
- [33] Model Compression with Two-stage Multi-teacher Knowledge Distillation for Web Question Answering System [J]. PROCEEDINGS OF THE 13TH INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING (WSDM '20), 2020: 690-698
- [34] MTUW-GAN: A Multi-Teacher Knowledge Distillation Generative Adversarial Network for Underwater Image Enhancement [J]. APPLIED SCIENCES-BASEL, 2024, 14 (02)
- [36] Data-Free Low-Bit Quantization via Dynamic Multi-teacher Knowledge Distillation [J]. PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT VIII, 2024, 14432: 28-41
- [37] An architecture for medical knowledge-based assistance systems [J]. IEEE SYMPOSIUM AND WORKSHOP ON ENGINEERING OF COMPUTER-BASED SYSTEMS, PROCEEDINGS, 1996: 442-449
- [38] Semi-supervised lung adenocarcinoma histopathology image classification based on multi-teacher knowledge distillation [J]. PHYSICS IN MEDICINE AND BIOLOGY, 2024, 69 (18)
- [39] Localized knowledge based intelligent medical systems [J]. CBMS 2003: 16TH IEEE SYMPOSIUM ON COMPUTER-BASED MEDICAL SYSTEMS, PROCEEDINGS, 2003: 89-96