共 50 条
- [2] Extremely Low Resource Text simplification with Pre-trained Transformer Language Model [J]. PROCEEDINGS OF THE 2019 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2019, : 53 - 58
- [3] ADAPTING PRE-TRAINED LANGUAGE MODELS TO LOW-RESOURCE TEXT SIMPLIFICATION: THE PATH MATTERS [J]. CONFERENCE ON LIFELONG LEARNING AGENTS, VOL 199, 2022, 199
- [7] Question Answering based Clinical Text Structuring Using Pre-trained Language Model [J]. 2019 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE (BIBM), 2019, : 1596 - 1600
- [9] RoBERTuito: a pre-trained language model for social media text in Spanish [J]. LREC 2022: THIRTEEN INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2022, : 7235 - 7243
- [10] Non-Autoregressive Text Generation with Pre-trained Language Models [J]. 16TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EACL 2021), 2021, : 234 - 243