Densely Connected Graph Attention Network Based on Iterative Path Reasoning for Document-Level Relation Extraction

Cited by: 0
Authors
Zhang, Hongya [1]
Huang, Zhen [1]
Li, Zhenzhen [1]
Li, Dongsheng [1]
Liu, Feng [1]
Affiliations
[1] Natl Univ Def Technol, Sch Comp Sci, Changsha, Peoples R China
Funding
National Key Research and Development Program of China
Keywords
Relation extraction; Densely connected graph attention network; Iterative path reasoning
DOI
10.1007/978-3-030-75765-6_22
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Document-level relation extraction is a challenging Natural Language Processing task that extracts relations expressed within a single sentence or across multiple sentences. It plays an important role in data mining and information retrieval. The key challenge comes from indirect relations expressed across sentences. Graph-based neural networks have proven effective for modeling structural information within a document. Existing methods enhance graph models with either the attention mechanism or iterative path reasoning, which is not sufficient to capture all of the useful structural information. In this paper, we propose a densely connected graph attention network based on iterative path reasoning (IPR-DCGAT) for document-level relation extraction. Our approach uses a densely connected graph attention network to model local and global information within the document. In addition, we propose learning dynamic path weights for reasoning over relations across sentences. Extensive experiments on three datasets demonstrate the effectiveness of our approach. Our model achieves an 84% F1 score on CDR, outperforming previous models by a significant margin of roughly 16.3%-22.5%. Meanwhile, the results of our approach are comparable or superior to the state-of-the-art results on the GDA and DocRED datasets.
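The abstract describes two components: a densely connected graph attention network over a document graph, and dynamic path weights for cross-sentence reasoning. The sketch below illustrates only the first idea under our own assumptions and is not the authors' released implementation: a small PyTorch stack of graph attention layers in which each layer receives the concatenation of the input features and all earlier layers' outputs, so later layers can mix local and global information. All names (GraphAttentionLayer, DenselyConnectedGAT, hidden_dim, num_layers) are hypothetical.

# Minimal sketch (not the authors' code): densely connected graph attention layers.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphAttentionLayer(nn.Module):
    """Single-head graph attention restricted to edges of an adjacency mask."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (num_nodes, in_dim), adj: (num_nodes, num_nodes), 1.0 where an edge exists
        z = self.proj(h)                                   # (N, out_dim)
        n = z.size(0)
        # Pairwise feature concatenation for attention scoring.
        zi = z.unsqueeze(1).expand(n, n, -1)
        zj = z.unsqueeze(0).expand(n, n, -1)
        scores = F.leaky_relu(self.attn(torch.cat([zi, zj], dim=-1))).squeeze(-1)
        scores = scores.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(scores, dim=-1)              # attention over neighbours
        alpha = torch.nan_to_num(alpha)                    # isolated nodes get zero rows
        return F.elu(alpha @ z)                            # (N, out_dim)

class DenselyConnectedGAT(nn.Module):
    """Stack of attention layers with DenseNet-style concatenative connections."""

    def __init__(self, in_dim: int, hidden_dim: int, num_layers: int):
        super().__init__()
        self.layers = nn.ModuleList(
            [GraphAttentionLayer(in_dim + i * hidden_dim, hidden_dim)
             for i in range(num_layers)]
        )

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        feats = [x]
        for layer in self.layers:
            # Each layer sees the input plus all earlier outputs, concatenated.
            feats.append(layer(torch.cat(feats, dim=-1), adj))
        return torch.cat(feats, dim=-1)                    # local + global node features

if __name__ == "__main__":
    nodes = torch.randn(5, 16)               # e.g. mention/entity/sentence nodes
    adj = (torch.rand(5, 5) > 0.5).float()   # toy document graph
    model = DenselyConnectedGAT(in_dim=16, hidden_dim=32, num_layers=3)
    print(model(nodes, adj).shape)           # torch.Size([5, 112])

The adjacency matrix here is a toy placeholder; in the paper's setting the graph would connect nodes built from the document (e.g. mentions, entities, sentences), and the iterative path reasoning with dynamic path weights would operate on top of the node representations produced by such a stack.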
Pages: 269-281
Number of pages: 13
Related Papers (50 records in total)
  • [1] Graph neural networks with selective attention and path reasoning for document-level relation extraction
    Hang, Tingting
    Feng, Jun
    Wang, Yunfeng
    Yan, Le
    APPLIED INTELLIGENCE, 2024, 54 (07) : 5353 - 5372
  • [2] Document-Level Relation Extraction with Path Reasoning
    Xu, Wang
    Chen, Kehai
    Zhao, Tiejun
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2023, 22 (04)
  • [3] Double Graph Based Reasoning for Document-level Relation Extraction
    Zeng, Shuang
    Xu, Runxin
    Chang, Baobao
    Li, Lei
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 1630 - 1640
  • [4] Document-Level Relation Extraction with Deep Gated Graph Reasoning
    Liang, Zeyu
    INTERNATIONAL JOURNAL OF UNCERTAINTY FUZZINESS AND KNOWLEDGE-BASED SYSTEMS, 2024, 32 (07) : 1037 - 1050
  • [5] Document-level relation extraction with multi-layer heterogeneous graph attention network
    Wang, Nianbin
    Chen, Tiantian
    Ren, Chaoqi
    Wang, Hongbin
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 123
  • [6] Document-Level Relation Extraction with Cross-sentence Reasoning Graph
    Liu, Hongfei
    Kang, Zhao
    Zhang, Lizong
    Tian, Ling
    Hua, Fujun
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2023, PT I, 2023, 13935 : 316 - 328
  • [7] Denoising Graph Inference Network for Document-Level Relation Extraction
    Wang, Hailin
    Qin, Ke
    Duan, Guiduo
    Luo, Guangchun
    BIG DATA MINING AND ANALYTICS, 2023, 6 (02) : 248 - 262
  • [8] Discriminative Reasoning for Document-level Relation Extraction
    Xu, Wang
    Chen, Kehai
    Zhao, Tiejun
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 1653 - 1663
  • [9] Dialogue Relation Extraction with Document-Level Heterogeneous Graph Attention Networks
    Chen, Hui
    Hong, Pengfei
    Han, Wei
    Majumder, Navonil
    Poria, Soujanya
    COGNITIVE COMPUTATION, 2023, 15 (02) : 793 - 802