36 records in total
- [3] Self-Distillation and Pinyin Character Prediction for Chinese Spelling Correction Based on Multimodality [J]. APPLIED SCIENCES-BASEL, 2024, 14 (04).
- [5] Mitigating Membership Inference Attacks by Self-Distillation Through a Novel Ensemble Architecture [J]. PROCEEDINGS OF THE 31ST USENIX SECURITY SYMPOSIUM, 2022: 1433-1450.
- [7] Reminding the Incremental Language Model via Data-Free Self-Distillation [J]. APPLIED INTELLIGENCE, 2023, 53: 9298-9320.
- [9] General Cross-Architecture Distillation of Pretrained Language Models into Matrix Embeddings [J]. 2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022.
- [10] MaskCLIP: Masked Self-Distillation Advances Contrastive Language-Image Pretraining [J]. 2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023: 10995-11005.