共 50 条
- [2] On the Transferability of Pre-trained Language Models for Low-Resource Programming Languages 30TH IEEE/ACM INTERNATIONAL CONFERENCE ON PROGRAM COMPREHENSION (ICPC 2022), 2022, : 401 - 412
- [3] Investigating Pre-trained Audio Encoders in the Low-Resource Condition INTERSPEECH 2023, 2023, : 1498 - 1502
- [4] DIONYSUS: A Pre-trained Model for Low-Resource Dialogue Summarization PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 1, 2023, : 1368 - 1386
- [8] A Comparative Study of Pre-trained Encoders for Low-Resource Named Entity Recognition PROCEEDINGS OF THE 7TH WORKSHOP ON REPRESENTATION LEARNING FOR NLP, 2022, : 46 - 59
- [9] Efficient Fine-Tuning for Low-Resource Tibetan Pre-trained Language Models ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING-ICANN 2024, PT VII, 2024, 15022 : 410 - 422
- [10] ADAPTING PRE-TRAINED LANGUAGE MODELS TO LOW-RESOURCE TEXT SIMPLIFICATION: THE PATH MATTERS CONFERENCE ON LIFELONG LEARNING AGENTS, VOL 199, 2022, 199