共 50 条
- [2] Character-Aware Low-Resource Neural Machine Translation with Weight Sharing and Pre-training [J]. CHINESE COMPUTATIONAL LINGUISTICS, CCL 2019, 2019, 11856 : 321 - 333
- [3] Does Masked Language Model Pre-training with Artificial Data Improve Low-resource Neural Machine Translation? [J]. 17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 2216 - 2225
- [4] Continual Mixed-Language Pre-Training for Extremely Low-Resource Neural Machine Translation [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 2706 - 2718
- [7] Evaluating Pre-training Objectives for Low-Resource Translation into Morphologically Rich Languages [J]. LREC 2022: THIRTEEN INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2022, : 4933 - 4943
- [8] Pre-training Methods for Neural Machine Translation [J]. ACL-IJCNLP 2021: THE 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING: TUTORIAL ABSTRACTS, 2021, : 21 - 25
- [9] Low-Resource Named Entity Recognition via the Pre-Training Model [J]. SYMMETRY-BASEL, 2021, 13 (05):
- [10] Language Model Prior for Low-Resource Neural Machine Translation [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 7622 - 7634