RETRACTED: English-Chinese Machine Translation Model Based on Bidirectional Neural Network with Attention Mechanism (Retracted Article)

Cited by: 6
Authors
Li Yonglan [1 ]
He Wenjia [2 ]
Affiliations
[1] Guizhou Univ Finance & Econ, Sch Foreign Languages, Guiyang 550025, Guizhou, Peoples R China
[2] Guizhou Univ Finance & Econ, Res Ctr Big Data Corpus & Language Projects, Sch Foreign Languages, Guiyang 550025, Peoples R China
Keywords
DOI
10.1155/2022/5199248
CLC Number
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Code
0808 ; 0809 ;
Abstract
In recent years, with the development of deep learning, neural machine translation has gradually become the mainstream approach in both industry and academia. Existing Chinese-English machine translation models generally adopt deep neural network architectures based on the attention mechanism; however, modeling short and long sequences simultaneously remains a challenging problem. Therefore, a bidirectional LSTM model integrating an attention mechanism is proposed. First, word vectors are used as the input data of the translation model, so that the linguistic symbols used in translation are represented mathematically. Second, two attention mechanisms are designed: a local attention mechanism and a global attention mechanism. The local attention mechanism learns which words or phrases in the input sequence are more important for modeling, while the global attention mechanism learns which layer of representation vectors for the input sequence is more critical. The bidirectional LSTM better fuses the feature information in the input sequence, and when combined with the attention mechanisms it can model short and long sequences simultaneously. The experimental results show that, compared with many existing translation models, the bidirectional LSTM model with attention mechanism effectively improves the quality of machine translation.
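The abstract describes the architecture only at a high level. As a rough illustration of the encoder side, the sketch below shows a bidirectional LSTM over word embeddings with a single word-level attention layer (in the spirit of the "local" attention described above), written in PyTorch. All class, method, and parameter names (BiLSTMEncoder, embed_dim, hidden_dim, attn) are assumptions for illustration and are not taken from the paper; the layer-level "global" attention is omitted.

```python
# Illustrative sketch only: a bidirectional LSTM encoder with word-level attention.
# Names and dimensions are assumed, not taken from the retracted article.
import torch
import torch.nn as nn

class BiLSTMEncoder(nn.Module):
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)   # word vectors as model input
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)  # forward + backward context
        self.attn = nn.Linear(2 * hidden_dim, 1)                # word-level attention scores

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)                    # (batch, seq_len, embed_dim)
        outputs, _ = self.lstm(embedded)                        # (batch, seq_len, 2*hidden_dim)
        scores = self.attn(outputs).squeeze(-1)                 # (batch, seq_len)
        weights = torch.softmax(scores, dim=-1)                 # attention weights over source words
        context = torch.bmm(weights.unsqueeze(1), outputs).squeeze(1)  # weighted sentence vector
        return outputs, context

# Usage: encode a toy batch of token ids.
encoder = BiLSTMEncoder(vocab_size=10000)
tokens = torch.randint(0, 10000, (2, 7))
states, sentence_vec = encoder(tokens)
```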
Pages: 11