Graph Convolution over Pruned Dependency Trees Improves Relation Extraction

Cited by: 0
Authors
Zhang, Yuhao [1 ]
Qi, Peng [1 ]
Manning, Christopher D. [1 ]
Affiliations
[1] Stanford Univ, Stanford, CA 94305 USA
Funding
National Science Foundation (USA);
Keywords
DOI
Not available
CLC Number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Dependency trees help relation extraction models capture long-range relations between words. However, existing dependency-based models either neglect crucial information (e.g., negation) by pruning the dependency trees too aggressively, or are computationally inefficient because it is difficult to parallelize over different tree structures. We propose an extension of graph convolutional networks that is tailored for relation extraction, which pools information over arbitrary dependency structures efficiently in parallel. To incorporate relevant information while maximally removing irrelevant content, we further apply a novel pruning strategy to the input trees by keeping words immediately around the shortest path between the two entities among which a relation might hold. The resulting model achieves state-of-the-art performance on the large-scale TACRED dataset, outperforming existing sequence and dependency-based neural models. We also show through detailed analysis that this model has complementary strengths to sequence models, and combining them further improves the state of the art.
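The abstract describes two components: a path-centric pruning strategy that keeps only tokens within distance K of the shortest dependency path between the two entities, and a graph convolution that pools information over the resulting tree. The following is a minimal sketch of those two ideas, not the authors' released code: the adjacency matrix, the BFS helpers, and the degree-normalized update `relu(D⁻¹(A+I)HW + b)` are illustrative assumptions based on the abstract's description of standard GCN practice.

```python
import numpy as np
from collections import deque


def bfs_dist(adj, sources):
    """Multi-source BFS distances over an undirected adjacency matrix."""
    n = len(adj)
    dist = [None] * n
    q = deque(sources)
    for s in sources:
        dist[s] = 0
    while q:
        u = q.popleft()
        for v in range(n):
            if adj[u][v] and dist[v] is None:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist


def shortest_path(adj, src, dst):
    """BFS parent-tracking; the path is unique in a dependency tree."""
    prev = {src: None}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in range(len(adj)):
            if adj[u][v] and v not in prev:
                prev[v] = u
                q.append(v)
    path, u = [], dst
    while u is not None:
        path.append(u)
        u = prev[u]
    return path[::-1]


def path_centric_prune(adj, e1, e2, k=1):
    """Keep tokens within distance k of the entity-to-entity dependency path."""
    path = shortest_path(adj, e1, e2)
    dist = bfs_dist(adj, path)
    keep = sorted(i for i, d in enumerate(dist) if d is not None and d <= k)
    return keep, adj[np.ix_(keep, keep)]


def gcn_layer(adj, h, w, b):
    """One degree-normalized graph convolution: relu(D^-1 (A+I) H W + b)."""
    a = adj + np.eye(len(adj))
    deg = a.sum(axis=1, keepdims=True)
    return np.maximum(0.0, (a @ h @ w) / deg + b)
```

For a six-token tree with entities at tokens 0 and 4, pruning with k=0 keeps exactly the tokens on the shortest path, while k=1 (the setting the paper reports as best) also admits their immediate neighbors, e.g. a negation word attached just off the path.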
Pages: 2205-2215
Number of pages: 11
Related Papers
50 results
  • [11] Reinforcement Learning with Dual Attention Guided Graph Convolution for Relation Extraction
    Li, Zhixin
    Sun, Yaru
    Tang, Suqin
    Zhang, Canlong
    Ma, Huifang
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 946 - 953
  • [12] Dependency-driven Relation Extraction with Attentive Graph Convolutional Networks
    Tian, Yuanhe
    Chen, Guimin
    Song, Yan
    Wan, Xiang
    59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (ACL-IJCNLP 2021), VOL 1, 2021, : 4458 - 4471
  • [13] Long-tail Relation Extraction via Knowledge Graph Embeddings and Graph Convolution Networks
    Zhang, Ningyu
    Deng, Shumin
    Sun, Zhanlin
    Wang, Guanying
    Chen, Xi
    Zhang, Wei
    Chen, Huajun
    2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 3016 - 3025
  • [14] Graph Convolution Network over Dependency Structure Improve Knowledge Base Question Answering
    Zhang, Chenggong
    Zha, Daren
    Wang, Lei
    Mu, Nan
    Yang, Chengwei
    Wang, Bin
    Xu, Fuyong
    ELECTRONICS, 2023, 12 (12)
  • [15] A Long-Tail Relation Extraction Model Based on Dependency Path and Relation Graph Embedding
    Li, Yifan
    Zong, Yanxiang
    Sun, Wen
    Wu, Qingqiang
    Hong, Qingqi
    WEB AND BIG DATA, PT II, APWEB-WAIM 2023, 2024, 14332 : 408 - 423
  • [16] Dependency-position relation graph convolutional network with hierarchical attention mechanism for relation extraction
    Li, Nan
    Wang, Ying
    Liu, Tianxu
    JOURNAL OF SUPERCOMPUTING, 2024, 80 (13): : 18954 - 18976
  • [17] Efficient convolution kernels for dependency and constituent syntactic trees
    Moschitti, Alessandro
    MACHINE LEARNING: ECML 2006, PROCEEDINGS, 2006, 4212 : 318 - 329
  • [18] Dependency Tree based Chinese Relation Extraction over Web Data
    Zheng, Shanshan
    Yang, Jing
    Lin, Xin
    Gu, JunZhong
    2012 SEVENTH INTERNATIONAL CONFERENCE ON KNOWLEDGE, INFORMATION AND CREATIVITY SUPPORT SYSTEMS (KICSS 2012), 2012, : 104 - 110
  • [19] Integrating Dependency Type and Directionality into Adapted Graph Attention Networks to Enhance Relation Extraction
    Zhao, Yiran
    Wu, Di
    Dai, Shuqi
    Li, Tong
    DOCUMENT ANALYSIS AND RECOGNITION-ICDAR 2024, PT IV, 2024, 14807 : 287 - 305
  • [20] A Biomedical Relation Extraction Method Based on Graph Convolutional Network with Dependency Information Fusion
    Yang, Wanli
    Xing, Linlin
    Zhang, Longbo
    Cai, Hongzhen
    Guo, Maozu
    APPLIED SCIENCES-BASEL, 2023, 13 (18):