Syntax-aware Transformer Encoder for Neural Machine Translation

Cited by: 0
Authors
Duan, Sufeng [1 ]
Zhao, Hai [1 ]
Zhou, Junru [1 ]
Wang, Rui [2 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Dept Comp Sci & Engn, Key Lab Shanghai Educ Commiss Intelligent Interac, MoE Key Lab Artificial Intelligence, AI Inst, Shanghai, Peoples R China
[2] Natl Inst Informat & Commun Technol NICT, Kyoto, Japan
Funding
National Natural Science Foundation of China;
Keywords
Neural Machine Translation; dependency parsing; POS Tagging;
DOI
10.1109/ialp48816.2019.9037672
CLC Number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Syntax has been shown to be a helpful clue in various natural language processing tasks, including earlier statistical machine translation and recurrent neural network based machine translation. However, since state-of-the-art neural machine translation (NMT) is built on a Transformer-based encoder, few attempts have been made at such syntax enhancement. In this paper, we therefore explore effective ways to introduce syntax into the Transformer for better machine translation. We empirically compare two ways, positional encoding and input embedding, of exploiting syntactic clues from the dependency tree of the source sentence. Our proposed methods have the merit of keeping the Transformer architecture unchanged, so the efficiency of the Transformer is preserved. Experimental results on IWSLT'14 German-to-English and WMT14 English-to-German show that our methods yield improved results over strong Transformer baselines.
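For illustration only (this is not the authors' released code): the sketch below shows, under assumed implementation details, how a dependency-derived per-token feature could be injected through the input embedding while leaving the Transformer encoder itself untouched, which is the design constraint the abstract emphasizes. The class name SyntaxAwareEncoderInput, the use of dependency-tree depth as the syntactic feature, the dep_depths input, and all hyperparameters are assumptions; the paper's actual positional-encoding variant and feature choices may differ.

import torch
import torch.nn as nn

class SyntaxAwareEncoderInput(nn.Module):
    """Hypothetical input layer: token embedding plus an embedding of a
    per-token syntactic feature (here, depth in the source dependency tree)."""
    def __init__(self, vocab_size, d_model, max_depth=64):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, d_model)
        # Assumed choice: treat dependency depth like a position index.
        self.syn_emb = nn.Embedding(max_depth, d_model)

    def forward(self, token_ids, dep_depths):
        # token_ids, dep_depths: (batch, seq_len) integer tensors
        return self.tok_emb(token_ids) + self.syn_emb(dep_depths)

# A standard, unmodified Transformer encoder consumes the syntax-aware input.
vocab_size, d_model = 10000, 512
embed = SyntaxAwareEncoderInput(vocab_size, d_model)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=d_model, nhead=8, batch_first=True),
    num_layers=6,
)

tokens = torch.randint(0, vocab_size, (2, 7))   # toy source token ids
depths = torch.randint(0, 8, (2, 7))            # toy per-token dependency depths
memory = encoder(embed(tokens, depths))         # shape: (2, 7, 512)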
Pages: 396-401
Page count: 6
Related Papers
50 records in total
  • [1] Improved Neural Machine Translation with a Syntax-Aware Encoder and Decoder
    Chen, Huadong
    Huang, Shujian
    Chiang, David
    Chen, Jiajun
    PROCEEDINGS OF THE 55TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2017), VOL 1, 2017, : 1936 - 1945
  • [2] Recurrent graph encoder for syntax-aware neural machine translation
    Ding, Liang
    Wang, Longyue
    Liu, Siyou
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14 (04) : 1053 - 1062
  • [3] Syntax-Aware Data Augmentation for Neural Machine Translation
    Duan, Sufeng
    Zhao, Hai
    Zhang, Dongdong
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2023, 31 : 2988 - 2999
  • [4] Syntax-Aware Complex-Valued Neural Machine Translation
    Liu, Yang
    Hou, Yuexian
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT V, 2023, 14258 : 474 - 485
  • [5] Syntax-Enhanced Neural Machine Translation with Syntax-Aware Word Representations
    Zhang, Meishan
    Li, Zhenghua
    Fu, Guohong
    Zhang, Min
    2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 1151 - 1161
  • [6] Syntax-aware neural machine translation directed by syntactic dependency degree
    Peng, Ru
    Hao, Tianyong
    Fang, Yi
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (23) : 16609 - 16625
  • [7] A Syntax-Aware Encoder for Authorship Attribution
    Liu, Jianbo
    Hu, Zhiqiang
    Zhang, Jiasheng
    Lee, Roy Ka-Wei
    Shao, Jie
    WEB INFORMATION SYSTEMS ENGINEERING - WISE 2021, PT I, 2021, 13080 : 403 - 411
  • [8] Syntax-Aware Neural Semantic Role Labeling
    Xia, Qingrong
    Li, Zhenghua
    Zhang, Min
    Zhang, Meishan
    Fu, Guohong
    Wang, Rui
    Si, Luo
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 7305 - 7313