50 records in total
- [31] BERTweet: A pre-trained language model for English Tweets [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING: SYSTEM DEMONSTRATIONS, 2020: 9-14
- [32] Pre-trained Language Model for Biomedical Question Answering [J]. MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2019, PT II, 2020, 1168: 727-740
- [33] Are Pre-trained Language Models Useful for Model Ensemble in Chinese Grammatical Error Correction? [J]. 61ST CONFERENCE OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 2, 2023: 893-901
- [34] CANCN-BERT: A Joint Pre-Trained Language Model for Classical and Modern Chinese [J]. PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021: 3112-3116
- [35] Misspelling Correction with Pre-trained Contextual Language Model [J]. PROCEEDINGS OF 2020 IEEE 19TH INTERNATIONAL CONFERENCE ON COGNITIVE INFORMATICS & COGNITIVE COMPUTING (ICCI*CC 2020), 2020: 144-149
- [36] Traditional Chinese Medicine Symptom Normalization Approach Based on Pre-Trained Language Models [J]. Beijing Youdian Daxue Xuebao/Journal of Beijing University of Posts and Telecommunications, 2022, 45(04): 13-18
- [39] A Pre-trained Language Model for Medical Question Answering Based on Domain Adaption [J]. NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, NLPCC 2022, PT II, 2022, 13552: 216-227
- [40] NMT Enhancement based on Knowledge Graph Mining with Pre-trained Language Model [J]. 2020 22ND INTERNATIONAL CONFERENCE ON ADVANCED COMMUNICATION TECHNOLOGY (ICACT): DIGITAL SECURITY GLOBAL AGENDA FOR SAFE SOCIETY!, 2020: 185-189