共 50 条
- [1] Legal judgment prediction based on pre-training model and knowledge distillation Kongzhi yu Juece/Control and Decision, 2021, 37 (01): : 67 - 76
- [2] MindLLM: Lightweight large language model pre-training, evaluation and domain application AI OPEN, 2024, 5 : 155 - 180
- [3] Knowledge distilled pre-training model for vision-language-navigation Applied Intelligence, 2023, 53 : 5607 - 5619
- [5] Self-Influence Guided Data Reweighting for Language Model Pre-training 2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING, EMNLP 2023, 2023, : 2033 - 2045
- [8] Graph Structure Enhanced Pre-Training Language Model for Knowledge Graph Completion IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2024, 8 (04): : 2697 - 2708
- [10] Filtering, Distillation, and Hard Negatives for Vision-Language Pre-Training 2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR, 2023, : 6967 - 6977