50 entries in total
- [1] A Multi-teacher Knowledge Distillation Framework for Distantly Supervised Relation Extraction with Flexible Temperature [J]. WEB AND BIG DATA, PT II, APWEB-WAIM 2023, 2024, 14332 : 103 - 116
- [2] Semi-supervised lung adenocarcinoma histopathology image classification based on multi-teacher knowledge distillation [J]. PHYSICS IN MEDICINE AND BIOLOGY, 2024, 69 (18):
- [4] Enhanced Accuracy and Robustness via Multi-teacher Adversarial Distillation [J]. COMPUTER VISION - ECCV 2022, PT IV, 2022, 13664 : 585 - 602
- [5] Semi-supervised teacher-student architecture for relation extraction [J]. NLP@NAACL-HLT 2019 - 3rd Workshop on Structured Prediction for NLP, Proceedings, 2021, : 29 - 37
- [6] Multi-teacher Self-training for Semi-supervised Node Classification with Noisy Labels [J]. PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2023, 2023, : 2946 - 2954
- [8] Biographical Semi-Supervised Relation Extraction Dataset [J]. PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22), 2022, : 3121 - 3130
- [10] Exploit a Multi-head Reference Graph for Semi-supervised Relation Extraction [J]. 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021