AMR Parsing via Graph⇆Sequence Iterative Inference

Cited by: 0
Authors
Cai, Deng [1 ]
Lam, Wai [1 ]
Affiliations
[1] Chinese Univ Hong Kong, Hong Kong, Peoples R China
Source
58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020) | 2020
Keywords
(none listed)
DOI
Not available
Chinese Library Classification
TP18 (Theory of Artificial Intelligence)
Discipline codes
081104; 0812; 0835; 1405
Abstract
We propose a new end-to-end model that treats AMR parsing as a series of dual decisions on the input sequence and the incrementally constructed graph. At each time step, our model performs multiple rounds of attention, reasoning, and composition that aim to answer two critical questions: (1) which part of the input sequence to abstract; and (2) where in the output graph to construct the new concept. We show that the answers to these two questions are mutually causal. We design a model based on iterative inference that helps achieve better answers from both perspectives, leading to greatly improved parsing accuracy. Our model outperforms all previously reported SMATCH scores by large margins. Remarkably, without the help of any large-scale pre-trained language model (e.g., BERT), our model already surpasses the previous state of the art that uses BERT. With the help of BERT, we push the state-of-the-art results to 80.2% on LDC2017T10 (AMR 2.0) and 75.4% on LDC2014T12 (AMR 1.0).
Pages: 1290-1301 (12 pages)
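The abstract above describes an alternating loop in which a sequence-side decision (which tokens to abstract into a concept) and a graph-side decision (where in the partial graph to attach that concept) repeatedly refine each other. The sketch below is a minimal, hypothetical illustration of that idea only: it assumes single-vector dot-product attention and additive state updates, and every name in it (attend, iterative_inference, token_states, node_states) is invented for illustration rather than taken from the authors' implementation.

```python
# Minimal, hypothetical sketch of the graph<->sequence iterative inference loop
# described in the abstract; not the authors' actual architecture.
import numpy as np

def attend(query, memory):
    """Dot-product attention: weights over memory rows and their weighted sum."""
    scores = memory @ query                      # (num_items,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights, weights @ memory             # (num_items,), (dim,)

def iterative_inference(token_states, node_states, state, num_rounds=4):
    """One parsing step: alternate between the two dual decisions,
    (1) which part of the input sequence to abstract (sequence attention), and
    (2) where in the partial graph to attach the new concept (graph attention),
    letting each round refine the shared state with the other side's answer."""
    seq_weights = graph_weights = None
    for _ in range(num_rounds):
        # Graph side: given the current state, guess the attachment point.
        graph_weights, graph_summary = attend(state, node_states)
        state = state + graph_summary            # fold the graph answer back in
        # Sequence side: given the refined state, guess which tokens to abstract.
        seq_weights, seq_summary = attend(state, token_states)
        state = state + seq_summary              # fold the sequence answer back in
    return seq_weights, graph_weights            # the final pair of dual decisions

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim = 8
    token_states = rng.normal(size=(5, dim))     # encoded input tokens
    node_states = rng.normal(size=(3, dim))      # nodes of the partial AMR graph
    seq_w, graph_w = iterative_inference(token_states, node_states,
                                         rng.normal(size=dim))
    print("alignment over tokens:", seq_w.round(3))
    print("attachment over nodes:", graph_w.round(3))
```

Running the script prints one probability distribution over the input tokens and one over the existing graph nodes, i.e. the two "dual decisions" the abstract refers to for a single time step of parsing.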
Related Papers (50 items in total)
  • [21] From text to graph: a general transition-based AMR parsing using neural network
    Gu, Min
    Gu, Yanhui
    Luo, Weilan
    Xu, Guandong
    Yang, Zhenglu
    Zhou, Junsheng
    Qu, Weiguang
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (11): 6009 - 6025
  • [22] AMR Parsing with Cache Transition Systems
    Peng, Xiaochang
    Gildea, Daniel
    Satta, Giorgio
    THIRTY-SECOND AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTIETH INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / EIGHTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2018, : 4897 - 4904
  • [23] Hierarchical Curriculum Learning for AMR Parsing
    Wang, Peiyi
    Chen, Liang
    Liu, Tianyu
    Dai, Damai
    Cao, Yunbo
    Chang, Baobao
    Sui, Zhifang
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022): (SHORT PAPERS), VOL 2, 2022, : 333 - 339
  • [24] Improving AMR parsing by exploiting the dependency parsing as an auxiliary task
    Wu, Taizhong
    Zhou, Junsheng
    Qu, Weiguang
    Gu, Yanhui
    Li, Bin
    Zhong, Huilin
    Long, Yunfei
    MULTIMEDIA TOOLS AND APPLICATIONS, 2021, 80: 30827 - 30838
  • [25] Recursive Non-Autoregressive Graph-to-Graph Transformer for Dependency Parsing with Iterative Refinement
    Mohammadshahi, Alireza
    Henderson, James
    TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2021, 9 : 120 - 138
  • [26] AMR Parsing with Latent Structural Information
    Zhou, Qiji
    Zhang, Yue
    Ji, Donghong
    Tang, Hao
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 4306 - 4319
  • [27] Stacked AMR Parsing with Silver Data
    Xia, Qingrong
    Li, Zhenghua
    Wang, Rui
    Zhang, Min
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021, 2021, : 4729 - 4738
  • [28] Structure-aware Fine-tuning of Sequence-to-sequence Transformers for Transition-based AMR Parsing
    Zhou, Jiawei
    Naseem, Tahira
    Astudillo, Ramon Fernandez
    Lee, Young-Suk
    Florian, Radu
    Roukos, Salim
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 6279 - 6290
  • [29] AMR Parsing with Causal Hierarchical Attention and Pointers
    Lou, Chao
    Tu, Kewei
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2023), 2023, : 8942 - 8955
  • [30] Multilingual AMR Parsing with Noisy Knowledge Distillation
    Cai, Deng
    Li, Xin
    Ho, Jackie Chun-Sing
    Bing, Lidong
    Lam, Wai
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021, 2021, : 2778 - 2789