Efficient Low-Resource Neural Machine Translation with Reread and Feedback Mechanism

Cited by: 1
Authors
Yu, Zhiqiang [1 ]
Yu, Zhengtao [1 ]
Guo, Junjun [1 ]
Huang, Yuxin [1 ]
Wen, Yonghua [1 ]
Affiliations
[1] Kunming University of Science and Technology, Faculty of Information Engineering and Automation, Yunnan Key Laboratory of Artificial Intelligence, 727 South Jingming Rd, Kunming 650500, Yunnan, People's Republic of China
Funding
National Natural Science Foundation of China
Keywords
Low-resource; neural machine translation; reread; feedback
DOI
10.1145/3365244
CLC classification
TP18 [Theory of artificial intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
How to utilize information sufficiently is a key problem in neural machine translation (NMT); rich-resource NMT addresses it effectively by leveraging large-scale bilingual sentence pairs. For low-resource NMT, however, the scarcity of bilingual sentence pairs leads to poor translation performance, so taking full advantage of global information in the encoding-decoding process is particularly important. In this article, we propose a novel reread-feedback NMT architecture (RFNMT) that exploits such global information. Our architecture builds on an improved sequence-to-sequence neural network and consists of a double-deck attention-based encoder-decoder framework, in which information generated by the first-pass encoding and decoding flows into the second-pass encoding, enabling better parameter initialization and fuller use of information. Specifically, we first propose a "reread" mechanism that transfers the outputs of the first-pass encoder to the second-pass encoder, where they are used to initialize it. Second, we propose a "feedback" mechanism that transfers the first-pass decoder's outputs to the second-pass encoder via an importance weight model and an improved gated recurrent unit (GRU). Experiments on multiple datasets show that our approach achieves significant improvements over state-of-the-art NMT systems, especially in low-resource settings.
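
The abstract describes the two mechanisms only at a high level; the PyTorch sketch below illustrates one plausible reading of them. It is an illustration, not the authors' published model: the module names, dimensions, the importance-weight scorer, and the sigmoid gate standing in for the paper's "improved GRU" are all assumptions made for this sketch.

# Minimal sketch of the reread-feedback idea from the abstract, in PyTorch.
# All names, sizes, and the gating formula are illustrative assumptions;
# the paper's own equations are not reproduced here.
import torch
import torch.nn as nn

class RereadFeedbackSketch(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # First-pass encoder and decoder (standard GRU seq2seq deck).
        self.enc1 = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.dec1 = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Second-pass encoder, initialized from first-pass state ("reread").
        self.enc2 = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # Assumed importance-weight model: scores first-pass decoder states.
        self.importance = nn.Linear(hid_dim, 1)
        # Assumed fusion gate standing in for the paper's "improved GRU".
        self.gate = nn.Linear(2 * hid_dim, hid_dim)

    def forward(self, src_ids, tgt_ids):
        src, tgt = self.embed(src_ids), self.embed(tgt_ids)
        # First pass: encode the source, decode the target.
        enc1_out, enc1_h = self.enc1(src)
        dec1_out, _ = self.dec1(tgt, enc1_h)
        # Reread: first-pass encoder state initializes the second encoder.
        enc2_out, _ = self.enc2(src, enc1_h)
        # Feedback: pool first-pass decoder states by importance weights...
        w = torch.softmax(self.importance(dec1_out), dim=1)   # (B, T_tgt, 1)
        feedback = (w * dec1_out).sum(dim=1, keepdim=True)    # (B, 1, H)
        # ...and gate them into every second-pass encoder state.
        fused = torch.cat([enc2_out, feedback.expand_as(enc2_out)], dim=-1)
        g = torch.sigmoid(self.gate(fused))
        return g * enc2_out + (1 - g) * feedback

model = RereadFeedbackSketch()
states = model(torch.randint(0, 1000, (2, 7)), torch.randint(0, 1000, (2, 9)))
print(states.shape)  # torch.Size([2, 7, 128])

In the full architecture a second, attention-based decoder would attend over these fused second-pass encoder states to produce the final translation; that stage is omitted here for brevity.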
Pages: 13
Related papers
50 items in total
  • [21] Wu, Nier; Hou, Hongxu; Li, Haoran; Chang, Xin; Jia, Xiaoning. Semantic Perception-Oriented Low-Resource Neural Machine Translation. Machine Translation, CCMT 2021, 2021, 1464: 51-62.
  • [22] Li, Yu; Li, Xiao; Yang, Yating; Dong, Rui. A Diverse Data Augmentation Strategy for Low-Resource Neural Machine Translation. Information, 2020, 11(05).
  • [23] Pang, Jianhui; Yang, Baosong; Wong, Derek Fai; Wan, Yu; Liu, Dayiheng; Chao, Lidia Sam; Xie, Jun. Rethinking the Exploitation of Monolingual Data for Low-Resource Neural Machine Translation. Computational Linguistics, 2023, 50(01): 25-47.
  • [24] Duh, Kevin; McNamee, Paul; Post, Matt; Thompson, Brian. Benchmarking Neural and Statistical Machine Translation on Low-Resource African Languages. Proceedings of the 12th International Conference on Language Resources and Evaluation (LREC 2020), 2020: 2667-2675.
  • [25] Ngoc Tan Le; Sadat, Fatiha. Towards a Low-Resource Neural Machine Translation for Indigenous Languages in Canada. Traitement Automatique des Langues, 2021, 62(03): 39-63.
  • [26] Kalimuthu, Marimuthu; Barz, Michael; Sonntag, Daniel. Incremental Domain Adaptation for Neural Machine Translation in Low-Resource Settings. Fourth Arabic Natural Language Processing Workshop (WANLP 2019), 2019: 1-10.
  • [27] Mueller, Aaron; Nicolai, Garrett; McCarthy, Arya D.; Lewis, Dylan; Wu, Winston; Yarowsky, David. An Analysis of Massively Multilingual Neural Machine Translation for Low-Resource Languages. Proceedings of the 12th International Conference on Language Resources and Evaluation (LREC 2020), 2020: 3710-3718.
  • [28] Unanue, Inigo Jauregi; Borzeshi, Ehsan Zare; Piccardi, Massimo. Regressing Word and Sentence Embeddings for Low-Resource Neural Machine Translation. IEEE Transactions on Artificial Intelligence, 2023, 4(03): 450-463.
  • [29] Karakanta, Alina; Dehdari, Jon; van Genabith, Josef. Neural machine translation for low-resource languages without parallel corpora. Machine Translation, 2018, 32(1-2): 167-189.
  • [30] Luo, Gongxu; Yang, Yating; Yuan, Yang; Chen, Zhanheng; Ainiwaer, Aizimaiti. Hierarchical Transfer Learning Architecture for Low-Resource Neural Machine Translation. IEEE Access, 2019, 7: 154157-154166.