共 50 条
- [2] Emotional Paraphrasing Using Pre-trained Language Models [J]. 2021 9TH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION WORKSHOPS AND DEMOS (ACIIW), 2021,
- [3] BERT-MK: Integrating Graph Contextualized Knowledge into Pre-trained Language Models [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 2281 - 2290
- [7] Devulgarization of Polish Texts Using Pre-trained Language Models [J]. COMPUTATIONAL SCIENCE, ICCS 2022, PT II, 2022, : 49 - 55
- [8] MERGEDISTILL: Merging Pre-trained Language Models using Distillation [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 2874 - 2887
- [9] Issue Report Classification Using Pre-trained Language Models [J]. 2022 IEEE/ACM 1ST INTERNATIONAL WORKSHOP ON NATURAL LANGUAGE-BASED SOFTWARE ENGINEERING (NLBSE 2022), 2022, : 29 - 32
- [10] Automated Assessment of Inferences Using Pre-Trained Language Models [J]. APPLIED SCIENCES-BASEL, 2024, 14 (09):