50 items in total
- [1] Enhancing Pre-trained Language Representation for Multi-Task Learning of Scientific Summarization [J]. 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020.
- [2] Multi-task Learning Based Online Dialogic Instruction Detection with Pre-trained Language Models [J]. ARTIFICIAL INTELLIGENCE IN EDUCATION (AIED 2021), PT II, 2021, 12749 : 183 - 189
- [4] MCM: A Multi-task Pre-trained Customer Model for Personalization [J]. PROCEEDINGS OF THE 17TH ACM CONFERENCE ON RECOMMENDER SYSTEMS, RECSYS 2023, 2023, : 637 - 639
- [7] Drug knowledge discovery via multi-task learning and pre-trained models [J]. BMC Medical Informatics and Decision Making, 21
- [8] JiuZhang 2.0: A Unified Chinese Pre-trained Language Model for Multi-task Mathematical Problem Solving [J]. PROCEEDINGS OF THE 29TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2023, 2023, : 5660 - 5672
- [10] Pre-trained Language Model with Prompts for Temporal Knowledge Graph Completion [J]. arXiv, 2023.