50 entries in total
- [2] Hierarchical Transfer Learning Architecture for Low-Resource Neural Machine Translation [J]. IEEE ACCESS, 2019, 7 : 154157 - 154166
- [3] Reinforced Curriculum Learning on Pre-Trained Neural Machine Translation Models [J]. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 9652 - 9659
- [5] Meta-Learning for Low-Resource Neural Machine Translation [J]. 2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 3622 - 3631
- [6] DIONYSUS: A Pre-trained Model for Low-Resource Dialogue Summarization [J]. PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 1, 2023, : 1368 - 1386
- [7] Pre-Trained Multilingual Sequence-to-Sequence Models: A Hope for Low-Resource Language Translation? [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), 2022, : 58 - 67
- [9] Deep Fusing Pre-trained Models into Neural Machine Translation [J]. THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 11468 - 11476