RMAN: Relational multi-head attention neural network for joint extraction of entities and relations

Cited by: 25
Authors:
Lai, Taiqu [1 ]
Cheng, Lianglun [2 ]
Wang, Depei [1 ]
Ye, Haiming [2 ]
Zhang, Weiwen [2 ]
Affiliations:
[1] Guangdong Univ Technol, Sch Automat, Guangzhou, Peoples R China
[2] Guangdong Univ Technol, Sch Comp, Guangzhou, Peoples R China
Funding: National Natural Science Foundation of China
Keywords:
Joint extraction of entities and relations; Relation feature; Multi-head attention; Sequence annotation; INFORMATION;
DOI: 10.1007/s10489-021-02600-2
CLC classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract:
The task of extracting entities and relations has evolved from pipelined (separate) extraction to joint extraction. The joint model overcomes the disadvantages of the pipelined extraction method and strengthens the information interaction between entities and relations. However, existing joint models rarely attend to the semantic information between words, which limits their ability to handle overlapping relations. In this paper, we propose RMAN, a model for the joint extraction of entities and relations that consists of a multi-feature fusion encoder for sentence representation and a decoder for sequence annotation. We first add a multi-head attention layer after the Bi-LSTM to obtain sentence representations, leveraging the attention mechanism to capture relation-based sentence representations. We then perform sequence annotation on these representations to obtain entity pairs. Experiments on the NYT-single, NYT-multi and WebNLG datasets demonstrate that our model efficiently extracts overlapping triples and outperforms the baselines.
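To make the pipeline the abstract describes concrete, here is a minimal PyTorch sketch of a Bi-LSTM encoder followed by a multi-head attention layer and per-relation sequence-tagging heads. This is an illustrative reading of the abstract only: the class name RMANSketch, all layer sizes, the number of relations, and the tag set are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class RMANSketch(nn.Module):
    """Illustrative sketch of the pipeline described in the abstract:
    Bi-LSTM encoding -> multi-head attention -> per-relation sequence
    annotation. Sizes, names and the tagging scheme are assumptions."""

    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128,
                 num_heads=4, num_relations=24, num_tags=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bi-LSTM sentence encoder; outputs 2*hidden_dim per token.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Multi-head self-attention over the Bi-LSTM outputs.
        self.attn = nn.MultiheadAttention(2 * hidden_dim, num_heads,
                                          batch_first=True)
        # One tagging head per relation type, so a single sentence can
        # yield overlapping triples (one tag sequence per relation).
        self.taggers = nn.ModuleList(
            [nn.Linear(2 * hidden_dim, num_tags)
             for _ in range(num_relations)])

    def forward(self, token_ids):
        x = self.embed(token_ids)        # (B, T, embed_dim)
        h, _ = self.bilstm(x)            # (B, T, 2*hidden_dim)
        s, _ = self.attn(h, h, h)        # attention-refined representation
        # Per-relation tag logits, e.g. over a BIOES-style tag set.
        return torch.stack([t(s) for t in self.taggers], dim=1)

model = RMANSketch(vocab_size=1000)
logits = model(torch.randint(0, 1000, (2, 12)))  # toy batch: 2 sentences, 12 tokens
print(logits.shape)  # torch.Size([2, 24, 12, 5])
```

Giving each relation its own tagging head is one common way to decode several overlapping triples from the same sentence; the paper's actual decoder may differ in detail.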
Pages: 3132-3142 (11 pages)
Related papers (50 in total):
  • [31] Multi-Head Attention for Multi-Modal Joint Vehicle Motion Forecasting
    Mercat, Jean; Gilles, Thomas; El Zoghby, Nicole; Sandou, Guillaume; Beauvois, Dominique; Gil, Guillermo Pita
    2020 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2020: 9638-9644
  • [32] Cross Aggregation of Multi-head Attention for Neural Machine Translation
    Cao, Juncheng; Zhao, Hai; Yu, Kai
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING (NLPCC 2019), PT I, 2019, 11838: 380-392
  • [33] Multi-head GAGNN: A Multi-head Guided Attention Graph Neural Network for Modeling Spatio-temporal Patterns of Holistic Brain Functional Networks
    Yan, Jiadong; Chen, Yuzhong; Yang, Shimin; Zhang, Shu; Jiang, Mingxin; Zhao, Zhongbo; Zhang, Tuo; Zhao, Yu; Becker, Benjamin; Liu, Tianming; Kendrick, Keith; Jiang, Xi
    MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION - MICCAI 2021, PT VII, 2021, 12907: 564-573
  • [34] Hybrid neural network model based on multi-head attention for English text emotion analysis
    Li, Ping
    EAI ENDORSED TRANSACTIONS ON SCALABLE INFORMATION SYSTEMS, 2022, 9 (35)
  • [35] Multi-Head Attention-Based Hybrid Deep Neural Network for Aeroengine Risk Assessment
    Li, Jian-Hang; Gao, Xin-Yue; Lu, Xiang; Liu, Guo-Dong
    IEEE ACCESS, 2023, 11: 113376-113389
  • [36] Hierarchical Task-aware Multi-Head Attention Network
    Du, Jing; Yao, Lina; Wang, Xianzhi; Guo, Bin; Yu, Zhiwen
    PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22), 2022: 1933-1937
  • [37] CephaNN: A Multi-Head Attention Network for Cephalometric Landmark Detection
    Qian, Jiahong; Luo, Weizhi; Cheng, Ming; Tao, Yubo; Lin, Jun; Lin, Hai
    IEEE ACCESS, 2020, 8: 112633-112641
  • [38] Multi-Head Attention Graph Network for Few Shot Learning
    Zhang, Baiyan; Ling, Hefei; Li, Ping; Wang, Qian; Shi, Yuxuan; Wu, Lei; Wang, Runsheng; Shen, Jialie
    CMC-COMPUTERS MATERIALS & CONTINUA, 2021, 68 (02): 1505-1517
  • [39] Accurate prediction of drug combination risk levels based on relational graph convolutional network and multi-head attention
    He, Shi-Hui; Yun, Lijun; Yi, Hai-Cheng
    JOURNAL OF TRANSLATIONAL MEDICINE, 2024, 22 (01)
  • [40] Serialized Multi-Layer Multi-Head Attention for Neural Speaker Embedding
    Zhu, Hongning; Lee, Kong Aik; Li, Haizhou
    INTERSPEECH 2021, 2021: 106-110