Selective Attention for Context-aware Neural Machine Translation

Cited by: 0
Authors
Maruf, Sameen [1 ]
Martins, Andre F. T. [2 ]
Haffari, Gholamreza [1 ]
Affiliations
[1] Monash Univ, Fac Informat Technol, Melbourne, Vic, Australia
[2] Unbabel & Inst Telecomunicacoes, Lisbon, Portugal
Funding
European Research Council
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Despite the progress made in sentence-level NMT, current systems still fall short of producing fluent, high-quality translations of full documents. Recent work on context-aware NMT considers only a few previous sentences as context and may not scale to entire documents. To address this, we propose a novel, scalable top-down approach to hierarchical attention for context-aware NMT: sparse attention selectively focuses on relevant sentences in the document context, and the model then attends to key words within those sentences. We also propose single-level attention approaches based on sentence- or word-level information in the context. The document-level context representation produced by these attention modules is integrated into the encoder or decoder of the Transformer model, depending on whether we use monolingual or bilingual context. Our experiments and evaluation on English-German datasets in different document MT settings show that our selective attention approach not only significantly outperforms context-agnostic baselines but also surpasses context-aware baselines in most cases.
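The abstract describes a two-level, top-down attention: a sparse sentence-level selection followed by word-level attention inside the selected sentences. As a rough illustration only, not the authors' implementation, the following minimal NumPy sketch uses sparsemax (Martins & Astudillo, 2016) at the sentence level and softmax at the word level; the mean-pooled sentence keys, the scaled dot-product scoring, and all function names here are assumptions made for the sketch.

import numpy as np

def sparsemax(z):
    # Sparsemax (Martins & Astudillo, 2016): projects scores onto the
    # probability simplex, giving exactly zero weight to low-scoring entries.
    z_sorted = np.sort(z)[::-1]
    cumsum = np.cumsum(z_sorted)
    k = np.arange(1, z.size + 1)
    support = 1 + k * z_sorted > cumsum
    k_z = k[support][-1]
    tau = (cumsum[support][-1] - 1.0) / k_z
    return np.maximum(z - tau, 0.0)

def hierarchical_context(query, ctx_sentences):
    # Top-down hierarchical attention sketch: first pick relevant context
    # sentences with sparsemax, then attend over the words inside them.
    # Sentence keys via mean pooling is an assumption made for this sketch.
    d = query.shape[0]
    sent_keys = np.stack([s.mean(axis=0) for s in ctx_sentences])
    sent_w = sparsemax(sent_keys @ query / np.sqrt(d))
    context = np.zeros_like(query)
    for w_s, words in zip(sent_w, ctx_sentences):
        if w_s == 0.0:
            continue  # sentence pruned by sparse sentence-level attention
        scores = words @ query / np.sqrt(d)
        word_w = np.exp(scores - scores.max())
        word_w /= word_w.sum()             # softmax over words in the sentence
        context += w_s * (word_w @ words)  # rescale by the sentence weight
    return context

# Toy usage: three context sentences of 7, 12, and 5 words, dimension 64.
rng = np.random.default_rng(0)
ctx = [rng.normal(size=(n, 64)) for n in (7, 12, 5)]
q = rng.normal(size=64)
print(hierarchical_context(q, ctx).shape)  # -> (64,)

In the paper the resulting document-level context representation is then integrated into the Transformer encoder or decoder; this sketch shows only the attention computation itself.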
Pages: 3092-3102
Page count: 11