Dual Interactive Attention Network for Joint Entity and Relation Extraction

Times Cited: 1
Authors
Li, Lishuang [1 ]
Wang, Zehao [1 ]
Qin, Xueyang [1 ]
Lu, Hongbin [1 ]
Affiliations
[1] Dalian Univ Technol, Dalian, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Joint entity and relation extraction; Dual network; Fine-grained attention cross-unit; External attention;
DOI
10.1007/978-3-031-17120-8_21
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Joint entity and relation extraction establishes a bond between the two tasks and outperforms sequential extraction with pipeline methods. Many joint approaches learn a unified representation for both tasks in order to exploit the correlations between Named Entity Recognition (NER) and Relation Extraction (RE). However, they suffer from feature confusion: features extracted for one task may conflict with those extracted for the other. To address this issue, we propose a novel Dual Interactive Attention Network that learns independent representations while guaranteeing bidirectional and fine-grained interaction between NER and RE. Specifically, we propose a Fine-grained Attention Cross-Unit to model interaction at the token level, which fully exploits the correlation between entities and relations. To obtain task-specific representations, we introduce a novel attention mechanism that captures the correlations among multiple sequences within a specific task and performs better than the traditional self-attention network. We conduct extensive experiments on five standard benchmarks (ACE04, ACE05, ADE, CoNLL04, SciERC) and achieve state-of-the-art performance, demonstrating the effectiveness of our approach to joint entity and relation extraction.
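The record does not include implementation details, but the Fine-grained Attention Cross-Unit described in the abstract can be read as a token-level, gated exchange of features between two task-specific streams (one for NER, one for RE). Below is a minimal PyTorch sketch under that reading; the class name, the sigmoid gating formulation, and all shapes are illustrative assumptions, not the authors' actual architecture (see DOI 10.1007/978-3-031-17120-8_21 for the paper itself).

# Minimal sketch of a token-level cross-unit between an NER stream and an
# RE stream. The gating formulation and all names are assumptions made for
# illustration, not the paper's published implementation.
import torch
import torch.nn as nn

class FineGrainedCrossUnit(nn.Module):
    """Let each token mix in a gated portion of the other task's features,
    so the two streams interact bidirectionally while staying separate."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.gate_ner = nn.Linear(2 * hidden_size, hidden_size)
        self.gate_re = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, h_ner: torch.Tensor, h_re: torch.Tensor):
        # h_ner, h_re: (batch, seq_len, hidden_size) token representations
        joint = torch.cat([h_ner, h_re], dim=-1)
        g_ner = torch.sigmoid(self.gate_ner(joint))  # per-token, per-dimension gate
        g_re = torch.sigmoid(self.gate_re(joint))
        new_ner = h_ner + g_ner * h_re  # NER stream absorbs gated RE features
        new_re = h_re + g_re * h_ner    # RE stream absorbs gated NER features
        return new_ner, new_re

# Example: exchange features between two 16-token sequences.
unit = FineGrainedCrossUnit(hidden_size=768)
ner_feats = torch.randn(2, 16, 768)
re_feats = torch.randn(2, 16, 768)
ner_out, re_out = unit(ner_feats, re_feats)  # both: (2, 16, 768)

The additive, gated form keeps each stream's own representation intact when its gate closes, which matches the abstract's stated goal of avoiding feature confusion while still allowing fine-grained, bidirectional interaction.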
Pages: 259-271
Number of Pages: 13