50 records in total
- [1] SsciBERT: a pre-trained language model for social science texts [J]. Scientometrics, 2023, 128(2): 1241-1263
- [3] Devulgarization of Polish Texts Using Pre-trained Language Models [C]. Computational Science, ICCS 2022, Part II, 2022: 49-55
- [4] RoBERTuito: a pre-trained language model for social media text in Spanish [C]. LREC 2022: Thirteenth International Conference on Language Resources and Evaluation, 2022: 7235-7243
- [5] Pre-trained Language Model Representations for Language Generation [C]. 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2019), Vol. 1, 2019: 4052-4059
- [6] Adder Encoder for Pre-trained Language Model [C]. Chinese Computational Linguistics, CCL 2023, 2023, 14232: 339-347
- [9] ViDeBERTa: A powerful pre-trained language model for Vietnamese [C]. 17th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2023, 2023: 1071-1078