- [1] Chinese-English Machine Translation Model Based On Transfer Learning And Self-attention [J]. JOURNAL OF APPLIED SCIENCE AND ENGINEERING, 2024, 27 (08): 3011-3019
- [2] Re-Transformer: A Self-Attention Based Model for Machine Translation [J]. AI IN COMPUTATIONAL LINGUISTICS, 2021, 189: 3-10
- [3] Self-Attention and Dynamic Convolution Hybrid Model for Neural Machine Translation [J]. 11TH IEEE INTERNATIONAL CONFERENCE ON KNOWLEDGE GRAPH (ICKG 2020), 2020: 352-359
- [4] X-Transformer: A Machine Translation Model Enhanced by the Self-Attention Mechanism [J]. APPLIED SCIENCES-BASEL, 2022, 12 (09)
- [5] Fixed Encoder Self-Attention Patterns in Transformer-Based Machine Translation [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020: 556-568
- [6] Enhancing Machine Translation with Dependency-Aware Self-Attention [J]. 58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020: 1618-1627
- [7] Multi-Granularity Self-Attention for Neural Machine Translation [J]. 2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019: 887-897
- [9] Why Self-Attention? A Targeted Evaluation of Neural Machine Translation Architectures [J]. 2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018: 4263-4272