50 entries in total
- [2] Low-Resource Neural Machine Translation Using XLNet Pre-training Model. In: Artificial Neural Networks and Machine Learning (ICANN 2021), Part V, vol. 12895, 2021, pp. 503-514.
- [3] Character-Aware Low-Resource Neural Machine Translation with Weight Sharing and Pre-training. In: Chinese Computational Linguistics (CCL 2019), vol. 11856, 2019, pp. 321-333.
- [4] Continual Mixed-Language Pre-Training for Extremely Low-Resource Neural Machine Translation. In: Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, 2021, pp. 2706-2718.
- [5] Rethinking Data Augmentation for Low-Resource Neural Machine Translation: A Multi-Task Learning Approach. In: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021, pp. 8502-8516.
- [6] Adaptive Knowledge Sharing in Multi-Task Learning: Improving Low-Resource Neural Machine Translation. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL 2018), Vol. 2, 2018, pp. 656-661.
- [7] Does Masked Language Model Pre-training with Artificial Data Improve Low-resource Neural Machine Translation? In: Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2023), 2023, pp. 2216-2225.
- [10] Evaluating Pre-training Objectives for Low-Resource Translation into Morphologically Rich Languages. In: Proceedings of the Thirteenth International Conference on Language Resources and Evaluation (LREC 2022), 2022, pp. 4933-4943.