- [1] Revisiting Pre-trained Models for Chinese Natural Language Processing [J]. Findings of the Association for Computational Linguistics: EMNLP 2020, 2020: 657-668.
- [2] Meta Distant Transfer Learning for Pre-trained Language Models [J]. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 9742-9752.
- [3] A Study of Pre-trained Language Models in Natural Language Processing [J]. 2020 IEEE International Conference on Smart Cloud (SmartCloud 2020), 2020: 116-121.
- [4] How Linguistically Fair Are Multilingual Pre-Trained Language Models? [J]. Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI 2021), 2021, 35: 12710-12718.
- [5] Multilingual Translation via Grafting Pre-trained Language Models [J]. Findings of the Association for Computational Linguistics: EMNLP 2021, 2021: 2735-2747.
- [6] Error Investigation of Pre-trained BERTology Models on Vietnamese Natural Language Inference [J]. Recent Challenges in Intelligent Information and Database Systems (ACIIDS 2022), 2022, 1716: 176-188.
- [7] On the Language Neutrality of Pre-trained Multilingual Representations [J]. Findings of the Association for Computational Linguistics: EMNLP 2020, 2020: 1663-1674.
- [8] Investigating Prompt Learning for Chinese Few-Shot Text Classification with Pre-Trained Language Models [J]. Applied Sciences-Basel, 2022, 12(21).