共 27 条
- [1] Kalchbrenner N, Blunsom P., Recurrent continuous translation models, Proc. of the 2013 Conf. on Empirical Methods in Natural Language Processing, pp. 1700-1709, (2013)
- [2] Sutskever I, Vinyals O, Le QV., Sequence to sequence learning with neural networks, Advances in Neural Information Processing Systems, pp. 3104-3112, (2014)
- [3] Bahdanau D, Cho K, Bengio Y., Neural machine translation by jointly learning to align and translate, Proc. of the 3rd Int’l Conf. on Learning Representations, (2015)
- [4] Gehring J, Auli M, Grangier D, Yarats D, Dauphin YN., Convolutional sequence to sequence learning, Proc. of the 34th Int’l Conf. on Machine Learning, pp. 1243-1252, (2017)
- [5] Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser L, Polosukhin I., Attention is all you need, Advances in Neural Information Processing Systems, pp. 5998-6008, (2017)
- [6] Koehn P, Och FJ, Marcu D., Statistical phrase-based translation, Proc. of the 2003 Conf. of the North American Chapter of the Association for Computational Linguistics on Human Language Technology, pp. 48-54, (2003)
- [7] Li YC, Xiong DY, Zhang M., A survey of neural machine translation, Chinese Journal of Computers, 41, 12, pp. 2734-2755, (2018)
- [8] Tu ZP, Lu ZD, Liu Y, Liu XH, Li H., Modeling coverage for neural machine translation, Proc. of the 54th Annual Meeting of the Association for Computational Linguistics, pp. 76-85, (2016)
- [9] Mi HT, Sankaran B, Wang ZG, Ittycheriah A., Coverage embedding models for neural machine translation, Proc. of the 2016 Conf. on Empirical Methods in Natural Language Processing, pp. 955-960, (2016)
- [10] Feng S, Liu SJ, Yang N, Li M, Zhou M, Zhu KQ., Improving attention modeling with implicit distortion and fertility for machine translation, Proc. of the 26th Int’l Conf. on Computational Linguistics: Technical Papers (COLING 2016), pp. 3082-3092, (2016)