共 50 条
- [42] Neural Machine Translation Models with Attention-Based Dropout Layer [J]. CMC-COMPUTERS MATERIALS & CONTINUA, 2023, 75 (02): : 2981 - 3009
- [43] Neural Machine Translation with Attention Based on a New Syntactic Branch Distance [J]. MACHINE TRANSLATION, CCMT 2019, 2019, 1104 : 47 - 57
- [44] Losing Heads in the Lottery: Pruning Transformer Attention in Neural Machine Translation [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 2664 - 2674
- [45] Attention Focusing for Neural Machine Translation by Bridging Source and Target Embeddings [J]. PROCEEDINGS OF THE 56TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL), VOL 1, 2018, : 1767 - 1776
- [46] Multi-Granularity Self-Attention for Neural Machine Translation [J]. 2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 887 - 897
- [47] An Effective Coverage Approach for Attention-based Neural Machine Translation [J]. PROCEEDINGS OF 2019 6TH NATIONAL FOUNDATION FOR SCIENCE AND TECHNOLOGY DEVELOPMENT (NAFOSTED) CONFERENCE ON INFORMATION AND COMPUTER SCIENCE (NICS), 2019, : 240 - 245
- [48] History Attention for Source-Target Alignment in Neural Machine Translation [J]. PROCEEDINGS OF 2018 TENTH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTATIONAL INTELLIGENCE (ICACI), 2018, : 619 - 624
- [50] Cross Aggregation of Multi-head Attention for Neural Machine Translation [J]. NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING (NLPCC 2019), PT I, 2019, 11838 : 380 - 392