RMAN: Relational multi-head attention neural network for joint extraction of entities and relations

Cited by: 0
Authors
Taiqu Lai
Lianglun Cheng
Depei Wang
Haiming Ye
Weiwen Zhang
Affiliations
[1] Guangdong University of Technology,School of Automation
[2] Guangdong University of Technology,School of Computers
Source
Applied Intelligence | 2022 / Vol. 52
Keywords
Joint extraction of entities and relations; Relation feature; Multi-head attention; Sequence annotation;
DOI
Not available
Abstract
The task of extracting entities and relations has evolved from distributed extraction to joint extraction. The joint model overcomes the disadvantages of the distributed extraction method and strengthens the information interaction between entities and relations. However, existing joint models rarely pay attention to the semantic information between words, and thus have limitations in solving the problem of overlapping relations. In this paper, we propose the RMAN model for joint extraction of entities and relations, which consists of an encoder that fuses multiple features into a sentence representation and a decoder that performs sequence annotation. We first add a multi-head attention layer after a Bi-LSTM to obtain sentence representations, and leverage the attention mechanism to capture relation-based sentence representations. Then, we perform sequence annotation on the sentence representation to obtain entity pairs. Experiments on the NYT-single, NYT-multi and WebNLG datasets demonstrate that our model can efficiently extract overlapping triples and outperforms baseline models.
Pages: 3132 - 3142
Number of pages: 10
Related Papers
50 records in total
  • [1] RMAN: Relational multi-head attention neural network for joint extraction of entities and relations
    Lai, Taiqu
    Cheng, Lianglun
    Wang, Depei
    Ye, Haiming
    Zhang, Weiwen
    APPLIED INTELLIGENCE, 2022, 52 (03) : 3132 - 3142
  • [2] Joint extraction of entities and relations based on character graph convolutional network and Multi-Head Self-Attention Mechanism
    Meng, Zhao
    Tian, Shengwei
    Yu, Long
    Lv, Yalong
    JOURNAL OF EXPERIMENTAL & THEORETICAL ARTIFICIAL INTELLIGENCE, 2021, 33 (02) : 349 - 362
  • [3] Joint Extraction of Clinical Entities and Relations Using Multi-head Selection Method
    Fang, Xintao
    Song, Yuting
    Maeda, Akira
    2021 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2021, : 99 - 104
  • [4] Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network
    Xue, Yafei
    Zhu, Jing
    Lyu, Jing
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [6] Multi-head attention graph convolutional network model: End-to-end entity and relation joint extraction based on multi-head attention graph convolutional network
    Tao, Zhihua
    Ouyang, Chunping
    Liu, Yongbin
    Chung, Tonglee
    Cao, Yixin
    CAAI TRANSACTIONS ON INTELLIGENCE TECHNOLOGY, 2023, 8 (02) : 468 - 477
  • [7] Relation guided and attention enhanced multi-head selection for relational facts extraction
    Zeng, Daojian
    Zhao, Chao
    Xv, Lu
    Dai, Jianhua
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 250
  • [8] Multi-head Attention with Hint Mechanisms for Joint Extraction of Entity and Relation
    Fang, Chih-Hsien
    Chen, Yi-Ling
    Yeh, Mi-Yen
    Lin, Yan-Shuo
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS: DASFAA 2021 INTERNATIONAL WORKSHOPS, 2021, 12680 : 321 - 335
  • [9] Joint extraction of entities and relations via an entity correlated attention neural model
    Li, Ren
    Li, Dong
    Yang, Jianxi
    Xiang, Fangyue
    Ren, Hao
    Jiang, Shixin
    Zhang, Luyi
    INFORMATION SCIENCES, 2021, 581 : 179 - 193
  • [10] Joint Extraction of Multiple Relations and Entities by Using a Hybrid Neural Network
    Zhou, Peng
    Zheng, Suncong
    Xu, Jiaming
    Qi, Zhenyu
    Bao, Hongyun
    Xu, Bo
    CHINESE COMPUTATIONAL LINGUISTICS AND NATURAL LANGUAGE PROCESSING BASED ON NATURALLY ANNOTATED BIG DATA, CCL 2017, 2017, 10565 : 135 - 146