Neural machine translation: past, present, and future

Cited by: 0

Authors
Shereen A. Mohamed
Ashraf A. Elsayed
Y. F. Hassan
Mohamed A. Abdou
Affiliations
[1] Department of Mathematics and Computer Science, Faculty of Science, Alexandria University
[2] Informatics Research Institute, City of Scientific Research and Technological Applications
Source
Neural Computing & Applications, 2021, 33(23)
Keywords
Neural machine translation; Attention mechanism; Self-attentional transformer; Convolutional sequence to sequence
DOI: not available
Abstract
Deep neural networks (DNNs) have achieved great success in several research areas such as information retrieval, image processing, and speech recognition. In the field of machine translation, neural machine translation (NMT) has surpassed statistical machine translation (SMT), which had been the dominant technology for a long time. The recent machine translation approach, which consists of two subnetworks called an encoder and a decoder, has achieved state-of-the-art performance on different benchmarks and for several language pairs. The growing interest of researchers in NMT is due to its simplicity compared to SMT, which consists of several components tuned separately. This paper describes the evolution of NMT, discusses the different attention mechanism architectures and the purpose of each, and presents some toolkits developed specifically for research on and production of NMT systems. The superiority of NMT over SMT is discussed, as well as the problems facing NMT.
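The abstract's mention of the self-attentional Transformer can be made concrete with a small worked example. Below is a minimal sketch, in Python with NumPy, of the scaled dot-product attention that underlies that architecture; all names, shapes, and the toy data are illustrative assumptions, not code from the surveyed paper.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q: (m, d) queries, K: (n, d) keys, V: (n, d_v) values.
    # Returns (m, d_v): each output row is a weighted average of the
    # rows of V, with weights softmax(Q K^T / sqrt(d)) over the keys.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # (m, n) similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V

# Toy usage: one decoder query attending over three encoder states.
rng = np.random.default_rng(0)
Q = rng.normal(size=(1, 4))  # current decoder query
K = rng.normal(size=(3, 4))  # encoder keys
V = rng.normal(size=(3, 4))  # encoder values
context = scaled_dot_product_attention(Q, K, V)
print(context.shape)  # (1, 4): a context vector for the query

In an encoder-decoder NMT model of the kind the abstract describes, Q would come from the decoder state and K and V from the encoder outputs, so each output row is a context vector: a softmax-weighted average of the encoder states most relevant to the current target position.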
Pages: 15919-15931 (12 pages)