共 50 条
- [2] μBERT: Mutation Testing using Pre-Trained Language Models [J]. 2022 IEEE 15TH INTERNATIONAL CONFERENCE ON SOFTWARE TESTING, VERIFICATION AND VALIDATION WORKSHOPS (ICSTW 2022), 2022, : 160 - 169
- [3] Devulgarization of Polish Texts Using Pre-trained Language Models [J]. COMPUTATIONAL SCIENCE, ICCS 2022, PT II, 2022, : 49 - 55
- [4] MERGEDISTILL: Merging Pre-trained Language Models using Distillation [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 2874 - 2887
- [5] Issue Report Classification Using Pre-trained Language Models [J]. 2022 IEEE/ACM 1ST INTERNATIONAL WORKSHOP ON NATURAL LANGUAGE-BASED SOFTWARE ENGINEERING (NLBSE 2022), 2022, : 29 - 32
- [6] Automated Assessment of Inferences Using Pre-Trained Language Models [J]. APPLIED SCIENCES-BASEL, 2024, 14 (09):
- [7] Annotating Columns with Pre-trained Language Models [J]. PROCEEDINGS OF THE 2022 INTERNATIONAL CONFERENCE ON MANAGEMENT OF DATA (SIGMOD '22), 2022, : 1493 - 1503
- [8] LaoPLM: Pre-trained Language Models for Lao [J]. LREC 2022: THIRTEEN INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2022, : 6506 - 6512
- [9] PhoBERT: Pre-trained language models for Vietnamese [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 1037 - 1042
- [10] HinPLMs: Pre-trained Language Models for Hindi [J]. 2021 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2021, : 241 - 246