Low-Resource Neural Machine Translation with Neural Episodic Control

Cited by: 0
Authors
Wu, Nier [1 ]
Hou, Hongxu [1 ]
Sun, Shuo [1 ]
Zheng, Wei [1 ]
Affiliations
[1] Inner Mongolia Univ, Coll Comp Sci, Coll Software, Hohhot, Peoples R China
Keywords
Reinforcement Learning; Machine Translation; DND; Episodic Control; Low-resource
DOI
10.1109/IJCNN52387.2021.9533677
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Reinforcement learning (RL) has been shown to alleviate the metric inconsistency and exposure bias that arise between training and evaluation in neural machine translation (NMT), but its sample efficiency is limited by the sampling method (Temporal-Difference (TD) or Monte-Carlo (MC)), and it still cannot compensate for the sparse non-zero rewards caused by insufficient training data. In addition, RL rewards only become effective once the model parameters are largely determined. We therefore propose an episodic-control reinforcement learning method: a model with largely determined parameters is first obtained through knowledge transfer, and historical action trajectories are recorded in a semi-tabular differentiable neural dictionary (DND), so that the model can quickly approximate the true state value from sampled rewards when updating the policy. We evaluate the method on the CCMT2019 Mongolian-Chinese (Mo-Zh), Tibetan-Chinese (Ti-Zh), and Uyghur-Chinese (Ug-Zh) tasks, and the results show that translation quality is significantly improved, demonstrating the effectiveness of the method.
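The abstract names a semi-tabular differentiable neural dictionary (DND) as the episodic memory. As a rough illustration only, and not the authors' implementation, the following minimal NumPy sketch shows the generic DND mechanics assumed from Neural Episodic Control: keys are state embeddings (e.g. decoder hidden states), values are scalar returns (e.g. sentence-level BLEU rewards), lookup is an inverse-distance kernel average over the k nearest stored keys, and a write either appends a new entry or nudges an existing one toward the new return. The class name, parameters (key_dim, k, delta, alpha), and the toy vectors are illustrative assumptions.

import numpy as np

class DND:
    """Minimal semi-tabular Differentiable Neural Dictionary (illustrative sketch)."""

    def __init__(self, key_dim, capacity=50000, k=16, delta=1e-3, alpha=0.5):
        self.keys = np.empty((0, key_dim), dtype=np.float32)   # stored state embeddings
        self.values = np.empty((0,), dtype=np.float32)         # stored value estimates
        self.capacity = capacity
        self.k = k
        self.delta = delta      # kernel smoothing constant
        self.alpha = alpha      # step size for updating re-visited keys

    def lookup(self, query):
        """Estimate the value of `query` as a kernel-weighted average of its k nearest keys."""
        if len(self.values) == 0:
            return 0.0
        dists = np.sum((self.keys - query) ** 2, axis=1)
        idx = np.argsort(dists)[: self.k]
        w = 1.0 / (dists[idx] + self.delta)     # inverse-distance kernel
        w /= w.sum()
        return float(np.dot(w, self.values[idx]))

    def write(self, key, value, match_tol=1e-6):
        """Insert (key, value); if an identical key exists, move its value toward the new return."""
        if len(self.values) > 0:
            dists = np.sum((self.keys - key) ** 2, axis=1)
            j = int(np.argmin(dists))
            if dists[j] < match_tol:
                self.values[j] += self.alpha * (value - self.values[j])
                return
        if len(self.values) >= self.capacity:   # evict the oldest entry (FIFO)
            self.keys = self.keys[1:]
            self.values = self.values[1:]
        self.keys = np.vstack([self.keys, key[None, :]])
        self.values = np.append(self.values, np.float32(value))

# Toy usage: store two state embeddings with sentence-level rewards,
# then query the value of a nearby state.
memory = DND(key_dim=4)
memory.write(np.array([0.1, 0.2, 0.3, 0.4], dtype=np.float32), value=0.62)
memory.write(np.array([0.9, 0.1, 0.0, 0.3], dtype=np.float32), value=0.35)
print(memory.lookup(np.array([0.1, 0.2, 0.3, 0.5], dtype=np.float32)))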
Pages: 7
Related Papers (50 records)
  • [1] A Survey on Low-Resource Neural Machine Translation
    Wang, Rui
    Tan, Xu
    Luo, Renqian
    Qin, Tao
    Liu, Tie-Yan
    [J]. PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 4636 - 4643
  • [2] A Survey on Low-resource Neural Machine Translation
    Li, Hong-Zheng
    Feng, Chong
    Huang, He-Yan
    [J]. Zidonghua Xuebao/Acta Automatica Sinica, 2021, 47 (06): : 1217 - 1231
  • [3] Transformers for Low-resource Neural Machine Translation
    Gezmu, Andargachew Mekonnen
    Nuernberger, Andreas
    [J]. ICAART: PROCEEDINGS OF THE 14TH INTERNATIONAL CONFERENCE ON AGENTS AND ARTIFICIAL INTELLIGENCE - VOL 1, 2022, : 459 - 466
  • [4] Low-resource Neural Machine Translation: Methods and Trends
    Shi, Shumin
    Wu, Xing
    Su, Rihai
    Huang, Heyan
    [J]. ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2022, 21 (05)
  • [5] Recent advances of low-resource neural machine translation
    Haque, Rejwanul
    Liu, Chao-Hong
    Way, Andy
    [J]. MACHINE TRANSLATION, 2021, 35 (04) : 451 - 474
  • [6] Neural Machine Translation for Low-resource Languages: A Survey
    Ranathunga, Surangika
    Lee, En-Shiun Annie
    Skenduli, Marjana Prifti
    Shekhar, Ravi
    Alam, Mehreen
    Kaur, Rishemjit
    [J]. ACM COMPUTING SURVEYS, 2023, 55 (11)
  • [7] Data Augmentation for Low-Resource Neural Machine Translation
    Fadaee, Marzieh
    Bisazza, Arianna
    Monz, Christof
    [J]. PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 2, 2017, : 567 - 573
  • [8] A Strategy for Referential Problem in Low-Resource Neural Machine Translation
    Ji, Yatu
    Shi, Lei
    Su, Yila
    Ren, Qing-dao-er-ji
    Wu, Nier
    Wang, Hongbin
    [J]. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2021, PT V, 2021, 12895 : 321 - 332
  • [9] Machine Translation in Low-Resource Languages by an Adversarial Neural Network
    Sun, Mengtao
    Wang, Hao
    Pasquine, Mark
    Hameed, Ibrahim A.
    [J]. APPLIED SCIENCES-BASEL, 2021, 11 (22):
  • [10] Low-Resource Neural Machine Translation: A Systematic Literature Review
    Yazar, Bilge Kagan
    Sahin, Durmus Ozkan
    Kilic, Erdal
    [J]. IEEE ACCESS, 2023, 11 : 131775 - 131813