Arabic Machine Transliteration using an Attention-based Encoder-decoder Model

Cited by: 17
Authors
Ameur, Mohamed Seghir Hadj [1 ]
Meziane, Farid [2 ]
Guessoum, Ahmed [1 ]
Institutions
[1] USTHB Univ, TALAA Grp, Algiers, Algeria
[2] Univ Salford, Informat Res Ctr, Salford M5 4WT, Lancs, England
Keywords
Natural Language Processing; Arabic Language; Arabic Transliteration; Deep Learning; Sequence-to-sequence Models; Encoder-decoder Architecture; Recurrent Neural Networks;
DOI
10.1016/j.procs.2017.10.120
Chinese Library Classification (CLC)
TP39 [Computer applications];
Subject classification codes
081203 ; 0835 ;
Abstract
Transliteration is the process of converting words from a source-language alphabet to a target-language alphabet in a way that best preserves the phonetic and orthographic aspects of the transliterated words. Although considerable effort has been made to improve this process for many languages such as English, French and Chinese, little research has addressed the Arabic language. In this work, an attention-based encoder-decoder system is proposed for the task of machine transliteration between Arabic and English. Our experiments demonstrated the efficiency of the proposed approach in comparison with previous work in this area. (C) 2017 The Authors. Published by Elsevier B.V.
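The abstract describes an attention-based encoder-decoder (sequence-to-sequence) architecture built on recurrent neural networks. For illustration only, the sketch below shows a minimal character-level encoder-decoder with attention in PyTorch; it is not the authors' implementation, and the layer sizes, vocabulary sizes, simplified additive attention, and greedy decoding loop are all assumptions.

```python
# Minimal sketch of an attention-based encoder-decoder for character-level
# transliteration (illustrative only; hyperparameters are hypothetical).
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)

    def forward(self, src):                           # src: (batch, src_len)
        outputs, hidden = self.rnn(self.embed(src))   # outputs: (batch, src_len, 2*hid)
        # merge forward/backward final states into one decoder initial state
        hidden = torch.tanh(hidden[0] + hidden[1]).unsqueeze(0)
        return outputs, hidden

class AttnDecoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=64, hid_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.attn = nn.Linear(hid_dim + 2 * hid_dim, 1)          # scores one source position
        self.rnn = nn.GRU(emb_dim + 2 * hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, prev_token, hidden, enc_outputs):
        # simplified additive attention over the encoder states
        src_len = enc_outputs.size(1)
        h = hidden.transpose(0, 1).repeat(1, src_len, 1)         # (batch, src_len, hid)
        scores = self.attn(torch.cat([h, enc_outputs], dim=2))   # (batch, src_len, 1)
        weights = torch.softmax(scores, dim=1)
        context = (weights * enc_outputs).sum(dim=1, keepdim=True)  # (batch, 1, 2*hid)
        emb = self.embed(prev_token).unsqueeze(1)                 # (batch, 1, emb)
        output, hidden = self.rnn(torch.cat([emb, context], dim=2), hidden)
        return self.out(output.squeeze(1)), hidden, weights

# Toy usage with assumed alphabet sizes (e.g. Arabic source, Latin target).
src_vocab, tgt_vocab, batch, src_len = 40, 30, 2, 7
encoder, decoder = Encoder(src_vocab), AttnDecoder(tgt_vocab)
src = torch.randint(0, src_vocab, (batch, src_len))
enc_out, hidden = encoder(src)
token = torch.zeros(batch, dtype=torch.long)          # <sos> id assumed to be 0
for _ in range(5):                                    # greedy decoding for 5 steps
    logits, hidden, _ = decoder(token, hidden, enc_out)
    token = logits.argmax(dim=1)
```

In practice such a model would be trained with teacher forcing and cross-entropy loss over target characters; the attention weights indicate which source characters the decoder attends to when emitting each target character.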
Pages: 287 - 297
Number of pages: 11
Related papers
50 records in total
  • [21] Code generation from a graphical user interface via attention-based encoder-decoder model
    Chen, Wen Yin
    Podstreleny, Pavol
    Cheng, Wen-Huang
    Chen, Yung-Yao
    Hua, Kai-Lung
    MULTIMEDIA SYSTEMS, 2022, 28 (01) : 121 - 130
  • [22] Enhancing lane changing trajectory prediction on highways: A heuristic attention-based encoder-decoder model
    Xiao, Xue
    Bo, Peng
    Chen, Yingda
    Chen, Yili
    Li, Keping
    PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2024, 639
  • [23] Multivariate time series forecasting via attention-based encoder-decoder framework
    Du, Shengdong
    Li, Tianrui
    Yang, Yan
    Horng, Shi-Jinn
    NEUROCOMPUTING, 2020, 388 : 269 - 279
  • [24] Multiple attention-based encoder-decoder networks for gas meter character recognition
    Li, Weidong
    Wang, Shuai
    Ullah, Inam
    Zhang, Xuehai
    Duan, Jinlong
    SCIENTIFIC REPORTS, 2022, 12 (01)
  • [25] Accurate water quality prediction with attention-based bidirectional LSTM and encoder-decoder
    Bi, Jing
    Chen, Zexian
    Yuan, Haitao
    Zhang, Jia
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 238
  • [26] Lane-Level Heterogeneous Traffic Flow Prediction: A Spatiotemporal Attention-Based Encoder-Decoder Model
    Zheng, Yan
    Li, Wenquan
    Zheng, Wen
    Dong, Chunjiao
    Wang, Shengyou
    Chen, Qian
    IEEE INTELLIGENT TRANSPORTATION SYSTEMS MAGAZINE, 2023, 15 (03) : 51 - 67
  • [27] An attention-based row-column encoder-decoder model for text recognition in Japanese historical documents
    Ly, Nam Tuan
    Nguyen, Cuong Tuan
    Nakagawa, Masaki
    PATTERN RECOGNITION LETTERS, 2020, 136 : 134 - 141
  • [28] A novel approach to workload prediction using attention-based LSTM encoder-decoder network in cloud environment
    Zhu, Yonghua
    Zhang, Weilin
    Chen, Yihai
    Gao, Honghao
    EURASIP JOURNAL ON WIRELESS COMMUNICATIONS AND NETWORKING, 2019, 2019 (01)
  • [29] Breathing Sound Segmentation and Detection Using Transfer Learning Techniques on an Attention-Based Encoder-Decoder Architecture
    Hsiao, Chiu-Han
    Lin, Ting-Wei
    Lin, Chii-Wann
    Hsu, Fu-Shun
    Lin, Frank Yeong-Sung
    Chen, Chung-Wei
    Chung, Chi-Ming
    42ND ANNUAL INTERNATIONAL CONFERENCES OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY: ENABLING INNOVATIVE TECHNOLOGIES FOR GLOBAL HEALTHCARE EMBC'20, 2020, : 754 - 759
  • [30] Attention-Based Encoder-Decoder End-to-End Neural Diarization With Embedding Enhancer
    Chen, Zhengyang
    Han, Bing
    Wang, Shuai
    Qian, Yanmin
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2024, 32 : 1636 - 1649