共 50 条
- [21] Why Self-Attention? A Targeted Evaluation of Neural Machine Translation Architectures [J]. 2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 4263 - 4272
- [22] Self-Attention and Dynamic Convolution Hybrid Model for Neural Machine Translation [J]. 11TH IEEE INTERNATIONAL CONFERENCE ON KNOWLEDGE GRAPH (ICKG 2020), 2020, : 352 - 359
- [23] Survey of Low-Resource Machine Translation [J]. COMPUTATIONAL LINGUISTICS, 2022, 48 (03) : 673 - 732
- [24] Adapting Attention-Based Neural Network to Low-Resource Mongolian-Chinese Machine Translation [J]. NATURAL LANGUAGE UNDERSTANDING AND INTELLIGENT APPLICATIONS (NLPCC 2016), 2016, 10102 : 470 - 480
- [25] Acoustic model training using self-attention for low-resource speech recognition [J]. JOURNAL OF THE ACOUSTICAL SOCIETY OF KOREA, 2020, 39 (05): : 483 - 489
- [27] Semantic Perception-Oriented Low-Resource Neural Machine Translation [J]. MACHINE TRANSLATION, CCMT 2021, 2021, 1464 : 51 - 62
- [30] Towards a Low-Resource Neural Machine Translation for Indigenous Languages in Canada [J]. TRAITEMENT AUTOMATIQUE DES LANGUES, 2021, 62 (03): : 39 - 63