Syntax-Aware Data Augmentation for Neural Machine Translation

Cited by: 4
Authors
Duan, Sufeng [1 ,2 ,3 ]
Zhao, Hai [1 ,2 ,3 ]
Zhang, Dongdong [4 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Dept Comp Sci & Engn, Shanghai 200240, Peoples R China
[2] Shanghai Jiao Tong Univ, Key Lab Shanghai Educ Commiss Intelligent Interact, Shanghai 200240, Peoples R China
[3] Shanghai Jiao Tong Univ, AI Inst, MoE Key Lab Artificial Intelligence, Shanghai 200240, Peoples R China
[4] Microsoft Res Asia, Beijing 100080, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Natural language processing; neural machine translation; data augmentation; dependency parsing
DOI
10.1109/TASLP.2023.3301214
CLC classification number
O42 [Acoustics];
Subject classification code
070206 ; 082403 ;
Abstract
Data augmentation is an effective method for enhancing the performance of neural machine translation (NMT) by generating additional bilingual data. In this article, we propose a novel data augmentation strategy for NMT. Unlike existing data augmentation methods that modify words with the same probability across different sentences, we introduce a sentence-specific probability for word selection based on the syntactic roles of words in the sentence. Our motivation is to use a linguistically motivated method to obtain more refined language generation rather than relying only on computation-motivated approaches. We argue that high-quality aligned bilingual data is crucial for NMT, and that computation-motivated data augmentation alone cannot provide sufficiently good additional training data. Our approach leverages the dependency parse tree of each input sentence to determine the selection probability of every word, using three different functions to compute probabilities for words at different depths. In addition, our method revises each word's probability according to the sentence length. We evaluate our method on multiple translation tasks. The experimental results demonstrate that the proposed data augmentation method effectively boosts existing sentence-independent methods, yielding significant performance improvements on translation tasks. Furthermore, an ablation study shows that our method selects fewer essential words and better preserves the syntactic structure.
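The core idea in the abstract — assigning each word a selection probability that grows with its depth in the dependency parse tree and is damped by sentence length, so that syntactically essential words near the root are rarely modified — can be sketched as follows. This is an illustrative reconstruction, not the paper's exact formulas: the linear depth function, the logarithmic length damping, and the parameter values below are assumptions made for demonstration.

```python
import math

def token_depths(heads):
    """Depth of each token in a dependency tree.

    heads[i] is the index of token i's head; -1 marks the root.
    The root has depth 0; its direct dependents have depth 1, etc.
    """
    depths = []
    for i in range(len(heads)):
        d, j = 0, i
        while heads[j] != -1:
            j = heads[j]
            d += 1
        depths.append(d)
    return depths

def selection_probs(heads, base=0.1, step=0.05):
    """Per-word probability of being chosen for augmentation.

    Deeper words (typically modifiers) receive a higher probability,
    so words forming the syntactic skeleton are mostly preserved; the
    log-length factor lowers all probabilities in longer sentences.
    Both choices are illustrative assumptions, not the paper's method.
    """
    depths = token_depths(heads)
    length_damp = 1.0 / math.log(len(heads) + math.e)
    return [min(1.0, (base + step * d) * length_damp) for d in depths]

# Toy parse of "The quick fox jumped over the lazy dog":
# "jumped" is the root; "fox" depends on "jumped"; articles and
# adjectives depend on their nouns; "dog" depends on "over".
heads = [2, 2, 3, -1, 3, 7, 7, 4]
probs = selection_probs(heads)
```

In this toy example the root verb "jumped" gets the lowest probability, while the deeply nested "lazy" and "the" (before "dog") get the highest, matching the abstract's claim that fewer essential words are selected.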
Pages: 2988-2999
Page count: 12
Related papers
50 in total
  • [1] Syntax-aware Transformer Encoder for Neural Machine Translation
    Duan, Sufeng
    Zhao, Hai
    Zhou, Junru
    Wang, Rui
    PROCEEDINGS OF THE 2019 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2019, : 396 - 401
  • [2] Improved Neural Machine Translation with a Syntax-Aware Encoder and Decoder
    Chen, Huadong
    Huang, Shujian
    Chiang, David
    Chen, Jiajun
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 1, 2017, : 1936 - 1945
  • [3] Recurrent graph encoder for syntax-aware neural machine translation
    Liang Ding
    Longyue Wang
    Siyou Liu
    International Journal of Machine Learning and Cybernetics, 2023, 14 : 1053 - 1062
  • [4] Recurrent graph encoder for syntax-aware neural machine translation
    Ding, Liang
    Wang, Longyue
    Liu, Siyou
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14 (04) : 1053 - 1062
  • [5] Syntax-Aware Complex-Valued Neural Machine Translation
    Liu, Yang
    Hou, Yuexian
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT V, 2023, 14258 : 474 - 485
  • [6] Syntax-Enhanced Neural Machine Translation with Syntax-Aware Word Representations
    Zhang, Meishan
    Li, Zhenghua
    Fu, Guohong
    Zhang, Min
    2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 1151 - 1161
  • [7] Syntax-aware neural machine translation directed by syntactic dependency degree
    Ru Peng
    Tianyong Hao
    Yi Fang
    Neural Computing and Applications, 2021, 33 : 16609 - 16625
  • [8] Syntax-aware neural machine translation directed by syntactic dependency degree
    Peng, Ru
    Hao, Tianyong
    Fang, Yi
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (23): : 16609 - 16625
  • [9] Syntax-Aware Neural Semantic Role Labeling
    Xia, Qingrong
    Li, Zhenghua
    Zhang, Min
    Zhang, Meishan
    Fu, Guohong
    Wang, Rui
    Si, Luo
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 7305 - 7313
  • [10] Syntax-aware entity representations for neural relation extraction
    He, Zhengqiu
    Chen, Wenliang
    Li, Zhenghua
    Zhang, Wei
    Shao, Hao
    Zhang, Min
    ARTIFICIAL INTELLIGENCE, 2019, 275 : 602 - 617