Dual Interactive Attention Network for Joint Entity and Relation Extraction

Cited by: 1
Authors
Li, Lishuang [1 ]
Wang, Zehao [1 ]
Qin, Xueyang [1 ]
Lu, Hongbin [1 ]
Affiliation
[1] Dalian Univ Technol, Dalian, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Joint entity and relation extraction; Dual network; Fine-grained attention cross-unit; External attention;
DOI
10.1007/978-3-031-17120-8_21
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The joint entity and relation extraction method establishes a bond between the two tasks, surpassing sequential extraction based on the pipeline method. Many joint works focus on learning a unified representation for both tasks to explore the correlations between Named Entity Recognition (NER) and Relation Extraction (RE). However, they suffer from feature confusion: features extracted for one task may conflict with those for the other. To address this issue, we propose a novel Dual Interactive Attention Network that learns independent representations while guaranteeing bidirectional and fine-grained interaction between NER and RE. Specifically, we propose a Fine-grained Attention Cross-Unit to model interaction at the token level, which fully explores the correlation between entity and relation. To obtain task-specific representations, we introduce a novel attention mechanism that captures the correlations among multiple sequences within each task and performs better than the traditional self-attention network. We conduct extensive experiments on five standard benchmarks (ACE04, ACE05, ADE, CoNLL04, SciERC) and achieve state-of-the-art performance, demonstrating the effectiveness of our approach for joint entity and relation extraction.
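The token-level interaction the abstract describes can be sketched as two task streams that each attend to the other's token representations before mixing the result back in. This is a minimal illustration only: the function names, the scaled dot-product form of the cross-attention, and the additive mixing weight `alpha` are assumptions for exposition, not the paper's exact cross-unit formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_seq, key_value_seq):
    # token-level scaled dot-product attention from one task's
    # token sequence (queries) over the other task's sequence
    d = query_seq.shape[-1]
    scores = query_seq @ key_value_seq.T / np.sqrt(d)   # (T, T)
    return softmax(scores, axis=-1) @ key_value_seq      # (T, d)

def fine_grained_cross_unit(h_ner, h_re, alpha=0.5):
    # Each stream keeps its own task-specific representation and
    # mixes in information attended from the other stream, giving
    # bidirectional, per-token interaction between NER and RE.
    # `alpha` (an assumed hyperparameter) balances own vs. cross features.
    h_ner_new = alpha * h_ner + (1 - alpha) * cross_attention(h_ner, h_re)
    h_re_new = alpha * h_re + (1 - alpha) * cross_attention(h_re, h_ner)
    return h_ner_new, h_re_new
```

With `alpha=1.0` the unit degrades to two fully independent streams, which is the pipeline-style setting the paper argues against; smaller values trade task-specific features for shared token-level context.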
Pages: 259-271
Page count: 13