Document-Level Neural Machine Translation With Recurrent Context States

Cited by: 1
Authors
Zhao, Yue [1 ]
Liu, Hui [2 ]
Affiliations
[1] Northeastern Univ, Sch Marxism, Shenyang 110819, Peoples R China
[2] Northeastern Univ, Sch Comp Sci & Engn, Shenyang 110819, Peoples R China
Keywords
Context modeling; Training; Complexity theory; Decoding; Computational modeling; Machine translation; Transformers; Neural machine translation; document-level translation; speeding up;
DOI
10.1109/ACCESS.2023.3247508
CLC number
TP [Automation technology; computer technology];
Subject classification code
0812;
Abstract
Integrating contextual information into sentence-level neural machine translation (NMT) systems has been proven effective in generating fluent and coherent translations. However, taking too much context into account slows these systems down, especially when context-aware models are applied on the decoder side. To improve efficiency, we propose a simple and fast method that encodes all sentences in an arbitrarily large context window. It builds contextual representations incrementally while translating each sentence, so the overhead introduced by the context model is almost negligible. We experiment with our method on three widely used English-German document-level translation datasets and obtain substantial improvements over the sentence-level baseline with almost no loss in efficiency. Moreover, our method achieves performance comparable to previous strong context-aware baselines while speeding up inference by 1.53x, and the speed-up grows as more context is taken into account. On the ContraPro pronoun translation dataset, it significantly outperforms the strong baseline.
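The core idea described in the abstract, namely maintaining a recurrent context state that is read when translating each sentence and updated once afterwards so that context cost stays constant regardless of document length, can be sketched roughly as follows. This is a hypothetical illustration only: the class name, the exponential-decay update rule, and the `encode`/`translate_one` hooks are assumptions for the sketch, not the paper's actual architecture.

```python
class RecurrentContextState:
    """A single fixed-size context vector, updated once per sentence.

    Because the state has constant size and is updated in O(dim) time,
    the cost of the context model does not grow with the number of
    preceding sentences, which is the efficiency property the abstract
    claims. (Sketch only; the paper's real update rule may differ.)
    """

    def __init__(self, dim, decay=0.5):
        self.state = [0.0] * dim
        self.decay = decay

    def read(self):
        # The decoder would condition on this vector while
        # translating the current sentence.
        return list(self.state)

    def update(self, sentence_vec):
        # Exponential-decay recurrence: old context is discounted and
        # the newest sentence encoding is mixed in.
        d = self.decay
        self.state = [d * s + (1.0 - d) * v
                      for s, v in zip(self.state, sentence_vec)]


def translate_document(sentences, encode, translate_one, dim=4):
    """Translate a document sentence by sentence, threading one
    recurrent context state through the whole document."""
    ctx = RecurrentContextState(dim)
    outputs = []
    for sent in sentences:
        # Read the accumulated context, translate, then fold the
        # current sentence's encoding into the state: one cheap
        # update per sentence, independent of document length.
        outputs.append(translate_one(sent, ctx.read()))
        ctx.update(encode(sent))
    return outputs
```

In a real NMT system `encode` would be the sentence encoder and `translate_one` the context-conditioned decoder; here they are left as injectable hooks so the recurrence itself stays visible.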
Pages: 27519-27526
Page count: 8
Related Papers
50 records in total
  • [41] A Simple and Effective Unified Encoder for Document-Level Machine Translation
    Ma, Shuming
    Zhang, Dongdong
    Zhou, Ming
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 3505 - 3511
  • [42] BLONDE: An Automatic Evaluation Metric for Document-level Machine Translation
    Jiang, Yuchen Eleanor
    Liu, Tianyu
    Ma, Shuming
    Zhang, Dongdong
    Yang, Jian
    Huang, Haoyang
    Sennrich, Rico
    Sachan, Mrinmaya
    Cotterell, Ryan
    Zhou, Ming
    NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 1550 - 1565
  • [43] Microsoft Translator at WMT 2019: Towards Large-Scale Document-Level Neural Machine Translation
    Junczys-Dowmunt, Marcin
    FOURTH CONFERENCE ON MACHINE TRANSLATION (WMT 2019), 2019, : 225 - 233
  • [44] Document Context Neural Machine Translation with Memory Networks
    Maruf, Sameen
    Haffari, Gholamreza
    PROCEEDINGS OF THE 56TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL), VOL 1, 2018, : 1275 - 1284
  • [45] Has Machine Translation Achieved Human Parity? A Case for Document-level Evaluation
    Laeubli, Samuel
    Sennrich, Rico
    Volk, Martin
    2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 4791 - 4796
  • [46] A Document-Level Machine Translation Quality Estimation Model Based on Centering Theory
    Chen, Yidong
    Zhong, Enjun
    Tong, Yiqi
    Qiu, Yanru
    Shi, Xiaodong
    MACHINE TRANSLATION, CCMT 2021, 2021, 1464 : 1 - 15
  • [47] Long-Short Term Masking Transformer: A Simple but Effective Baseline for Document-level Neural Machine Translation
    Zhang, Pei
    Chen, Boxing
    Ge, Niyu
    Fan, Kai
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 1081 - 1087
  • [48] Document-Level Machine Translation Evaluation Metrics Enhanced with Simplified Lexical Chain
    Gong, Zhengxian
    Zhou, Guodong
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, NLPCC 2015, 2015, 9362 : 396 - 403
  • [49] Investigating Contextual Influence in Document-Level Translation
    Nayak, Prashanth
    Haque, Rejwanul
    Kelleher, John D.
    Way, Andy
    INFORMATION, 2022, 13 (05)
  • [50] Enhancing Context Modeling with a Query-Guided Capsule Network for Document-level Translation
    Yang, Zhengxin
    Zhang, Jinchao
    Meng, Fandong
    Gu, Shuhao
    Feng, Yang
    Zhou, Jie
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 1527 - 1537