45 references in total
- [1] Banarescu L, Bonial C, Cai S, et al., Abstract meaning representation for sembanking, Proc. of the 7th Linguistic Annotation Workshop and Interoperability with Discourse, pp. 178-186, (2013)
- [2] Tamchyna A, Quirk C, Galley M., A discriminative model for semantics-to-string translation, Proc. of the 1st Workshop on Semantics-driven Statistical Machine Translation, pp. 30-36, (2015)
- [3] Mitra A, Baral C., Addressing a question answering challenge by combining statistical methods with inductive rule learning and reasoning, Proc. of the 30th AAAI Conf. on Artificial Intelligence, pp. 2779-2785, (2016)
- [4] Li X, Nguyen T H, Cao K, et al., Improving event detection with abstract meaning representation, Proc. of the 1st Workshop on Computing News Storylines, pp. 11-15, (2015)
- [5] Xu K, Wu L, Wang Z, et al., Graph2seq: Graph to sequence learning with attention-based neural networks, (2018)
- [6] Flanigan J, Dyer C, Smith N A, et al., Generation from abstract meaning representation using tree transducers, Proc. of the 2016 Conf. of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 731-739, (2016)
- [7] Song L, Peng X, Zhang Y, et al., AMR-to-text generation with synchronous node replacement grammar, Proc. of the 55th Annual Meeting of the Association for Computational Linguistics, 2, pp. 7-13, (2017)
- [8] Pourdamghani N, Knight K, Hermjakob U., Generating English from abstract meaning representations, Proc. of the 9th Int'l Natural Language Generation Conf., 21, (2016)
- [9] Ferreira T C, Calixto I, Wubben S, et al., Linguistic realisation as machine translation: Comparing different MT models for AMR-to-text generation, Proc. of the 10th Int'l Conf. on Natural Language Generation, pp. 1-10, (2017)
- [10] Zhu J, Li J, Zhu M, et al., Modeling graph structure in transformer for better AMR-to-text generation, Proc. of the 2019 Conf. on Empirical Methods in Natural Language Processing and the 9th Int'l Joint Conf. on Natural Language Processing, pp. 5462-5471, (2019)
- [10] Zhu J, Li J, Zhu M, Et al., Modeling graph structure in transformer for better AMR-to-text generation, Proc. of the 2019 Conf. on Empirical Methods in Natural Language Processing and the 9th Int'l Joint Conf. on Natural Language Processing, pp. 5462-5471, (2019)