50 entries in total
- [1] Pre-training Methods for Neural Machine Translation [J]. ACL-IJCNLP 2021: The 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing: Tutorial Abstracts, 2021: 21-25
- [3] On the Copying Behaviors of Pre-Training for Neural Machine Translation [J]. Findings of the Association for Computational Linguistics, ACL-IJCNLP 2021, 2021: 4265-4275
- [4] Curriculum Pre-training for Stylized Neural Machine Translation [J]. Applied Intelligence, 2024, 54(17-18): 7958-7968
- [7] Joint Training for Neural Machine Translation Models with Monolingual Data [J]. Thirty-Second AAAI Conference on Artificial Intelligence / Thirtieth Innovative Applications of Artificial Intelligence Conference / Eighth AAAI Symposium on Educational Advances in Artificial Intelligence, 2018: 555-562
- [8] DEEP: DEnoising Entity Pre-training for Neural Machine Translation [J]. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol 1: Long Papers, 2022: 1753-1766
- [9] On the Complementarity between Pre-Training and Back-Translation for Neural Machine Translation [J]. Findings of the Association for Computational Linguistics, EMNLP 2021, 2021: 2900-2907
- [10] Modeling Concentrated Cross-Attention for Neural Machine Translation with Gaussian Mixture Model [J]. Findings of the Association for Computational Linguistics, EMNLP 2021, 2021: 1401-1411