Arabic Machine Transliteration using an Attention-based Encoder-decoder Model

Cited by: 17
Authors
Ameur, Mohamed Seghir Hadj [1 ]
Meziane, Farid [2 ]
Guessoum, Ahmed [1 ]
Affiliations
[1] USTHB Univ, TALAA Grp, Algiers, Algeria
[2] Univ Salford, Informat Res Ctr, Salford M5 4WT, Lancs, England
Keywords
Natural Language Processing; Arabic Language; Arabic Transliteration; Deep Learning; Sequence-to-sequence Models; Encoder-decoder Architecture; Recurrent Neural Networks;
DOI
10.1016/j.procs.2017.10.120
Chinese Library Classification (CLC)
TP39 [Computer Applications];
Discipline Codes
081203; 0835;
Abstract
Transliteration is the process of converting words from a source-language alphabet into a target-language alphabet in a way that best preserves the phonetic and orthographic properties of the transliterated words. Although considerable effort has been devoted to improving this process for many languages, such as English, French and Chinese, little research has addressed the Arabic language. In this work, an attention-based encoder-decoder system is proposed for the task of machine transliteration between Arabic and English. Our experiments demonstrate the effectiveness of the proposed approach in comparison with previous work in this area. (C) 2017 The Authors. Published by Elsevier B.V.
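At the heart of the encoder-decoder architecture named in the abstract is an attention mechanism that, at each decoding step, scores every encoder hidden state against the current decoder state and forms a weighted context vector. The following is a minimal sketch of Bahdanau-style additive attention in NumPy; the shapes, random weights, and function name are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

T, d = 5, 8                             # source length (e.g. Arabic characters), hidden size
enc_states = rng.normal(size=(T, d))    # encoder hidden states h_1 .. h_T
dec_state = rng.normal(size=(d,))       # current decoder state s

# Learned parameters (randomly initialised here; assumed, not from the paper)
W_h = rng.normal(size=(d, d))
W_s = rng.normal(size=(d, d))
v = rng.normal(size=(d,))

def attention(s, H):
    """Additive attention: score each encoder state, softmax, weighted sum."""
    scores = np.tanh(H @ W_h.T + s @ W_s.T) @ v   # alignment scores, shape (T,)
    weights = np.exp(scores - scores.max())       # numerically stable softmax
    weights /= weights.sum()                      # attention distribution over source
    context = weights @ H                         # context vector fed to the decoder
    return weights, context

alpha, c = attention(dec_state, enc_states)
# alpha sums to 1 over the source positions; c has the hidden dimension d
```

In a full transliteration model, the context vector `c` would be concatenated with the decoder input at each step to predict the next target-language character.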
Pages: 287-297
Page count: 11