RESA: Relation Enhanced Self-Attention for Low-Resource Neural Machine Translation

Cited by: 2
Authors
Wu, Xing [1 ]
Shi, Shumin [1 ]
Huang, Heyan [1 ]
Affiliations
[1] Beijing Inst Technol, Sch Comp Sci & Technol, Beijing, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Low-Resource Neural Machine Translation; Dependency Syntax; Self-Attention;
DOI
10.1109/IALP54817.2021.9675172
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Transformer-based neural machine translation models have achieved impressive results on many translation tasks. Meanwhile, several studies show that explicitly incorporating syntactic information can yield further improvements, especially for low-resource languages. In this paper, we propose RESA, a relation-enhanced self-attention mechanism for the Transformer that integrates source-side dependency syntax. More specifically, dependency parsing produces two kinds of information: dependency heads and relation labels. Whereas previous work attends only to dependency-head information, RESA also integrates relation labels, using two methods: 1) a hard way, which maps the relation-label sequence to continuous representations and uses a hyperparameter to control how much of this information is mixed in; 2) a gate way, which employs a gating mechanism to mix word information with relation-label information. We evaluate our methods on low-resource Chinese-Tibetan and Chinese-Mongolian translation tasks, and the preliminary experimental results show that the proposed model achieves gains of 0.93 and 0.68 BLEU over the baseline model.
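Since only the abstract is available here, the sketch below is a minimal, hypothetical illustration of how the two fusion schemes it describes could look on top of a Transformer encoder; the module name RelationFusion, the interpolation formula of the hard way, and the sigmoid gate of the gate way are assumptions based on the abstract, not the authors' implementation.

```python
# Hedged sketch (not the authors' code): mixing word representations with
# dependency relation-label representations in two ways, per the abstract.
import torch
import torch.nn as nn


class RelationFusion(nn.Module):
    """Mixes word representations with dependency relation-label embeddings."""

    def __init__(self, d_model: int, num_relations: int,
                 mode: str = "gate", lambda_mix: float = 0.5):
        super().__init__()
        # Map relation labels (e.g. nsubj, dobj) to continuous vectors.
        self.rel_embed = nn.Embedding(num_relations, d_model)
        self.mode = mode
        self.lambda_mix = lambda_mix                   # hyperparameter for the hard way
        self.gate = nn.Linear(2 * d_model, d_model)    # used by the gate way

    def forward(self, word_repr: torch.Tensor, rel_labels: torch.Tensor):
        # word_repr: (batch, seq_len, d_model); rel_labels: (batch, seq_len)
        rel_repr = self.rel_embed(rel_labels)
        if self.mode == "hard":
            # Hard way: fixed interpolation controlled by a hyperparameter.
            return (1.0 - self.lambda_mix) * word_repr + self.lambda_mix * rel_repr
        # Gate way: a learned, per-position gate mixes the two sources.
        g = torch.sigmoid(self.gate(torch.cat([word_repr, rel_repr], dim=-1)))
        return g * word_repr + (1.0 - g) * rel_repr


# Toy usage: 2 sentences of length 5, model dimension 8, 40 relation types.
fusion = RelationFusion(d_model=8, num_relations=40, mode="gate")
words = torch.randn(2, 5, 8)
labels = torch.randint(0, 40, (2, 5))
print(fusion(words, labels).shape)  # torch.Size([2, 5, 8])
```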
Pages: 159-164
Number of pages: 6