50 records in total
- [41] Training Deeper Neural Machine Translation Models with Transparent Attention [J]. 2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 3028 - 3033
- [42] Recursive Annotations for Attention-Based Neural Machine Translation [J]. 2017 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2017, : 164 - 167
- [43] Fine-grained attention mechanism for neural machine translation [J]. NEUROCOMPUTING, 2018, 284 : 171 - 176
- [44] Training with Adversaries to Improve Faithfulness of Attention in Neural Machine Translation [J]. AACL-IJCNLP 2020: THE 1ST CONFERENCE OF THE ASIA-PACIFIC CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 10TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING: PROCEEDINGS OF THE STUDENT RESEARCH WORKSHOP, 2020, : 86 - 93
- [45] Towards Understanding Neural Machine Translation with Attention Heads' Importance [J]. APPLIED SCIENCES-BASEL, 2024, 14 (07)
- [46] Syntax-Based Attention Masking for Neural Machine Translation [J]. 2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 47 - 52
- [47] Look-Ahead Attention for Generation in Neural Machine Translation [J]. NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, NLPCC 2017, 2018, 10619 : 211 - 223
- [48] Selective Attention for Context-aware Neural Machine Translation [J]. 2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 3092 - 3102