RMAN: Relational multi-head attention neural network for joint extraction of entities and relations

Cited: 25
Authors
Lai, Taiqu [1 ]
Cheng, Lianglun [2 ]
Wang, Depei [1 ]
Ye, Haiming [2 ]
Zhang, Weiwen [2 ]
Affiliations
[1] Guangdong Univ Technol, Sch Automat, Guangzhou, Peoples R China
[2] Guangdong Univ Technol, Sch Comp, Guangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Joint extraction of entities and relations; Relation feature; Multi-head attention; Sequence annotation; INFORMATION;
DOI
10.1007/s10489-021-02600-2
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The task of extracting entities and relations has evolved from distributed extraction to joint extraction. The joint model overcomes the disadvantages of the distributed extraction method and strengthens the information interaction between entities and relations. However, existing joint models rarely attend to the semantic information between words, which limits their ability to resolve overlapping relations. In this paper, we propose RMAN, a model for the joint extraction of entities and relations that consists of a multi-feature-fusion encoder for sentence representation and a decoder for sequence annotation. We first add a multi-head attention layer after the Bi-LSTM to obtain sentence representations, leveraging the attention mechanism to capture relation-based sentence representations. We then perform sequence annotation on these representations to obtain entity pairs. Experiments on the NYT-single, NYT-multi, and WebNLG datasets demonstrate that our model efficiently extracts overlapping triples and outperforms other baselines.
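To make the pipeline described in the abstract concrete, the following is a minimal PyTorch sketch of that encoder-decoder shape: a Bi-LSTM encoder whose states are fused by a multi-head attention layer, followed by a per-token classifier standing in for the sequence-annotation decoder. All dimensions, the tag count, and the class name RMANSketch are illustrative assumptions, not the authors' implementation.

# Minimal sketch of the abstract's architecture (assumed hyperparameters).
import torch
import torch.nn as nn

class RMANSketch(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=128,
                 hidden_dim=128, num_heads=4, num_tags=9):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Bi-LSTM encoder: 2 * hidden_dim features per token.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Multi-head attention over the Bi-LSTM states, as the abstract
        # describes, to capture relation-based sentence representations.
        self.attn = nn.MultiheadAttention(2 * hidden_dim, num_heads,
                                          batch_first=True)
        # Token-level tag scores, standing in for the annotation decoder.
        self.tagger = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):
        x = self.embed(token_ids)      # (batch, seq, embed_dim)
        h, _ = self.bilstm(x)          # (batch, seq, 2 * hidden_dim)
        a, _ = self.attn(h, h, h)      # self-attention fusion of states
        return self.tagger(a)          # (batch, seq, num_tags)

# Usage: tag scores for a batch of two 6-token sentences.
model = RMANSketch()
ids = torch.randint(0, 10000, (2, 6))
print(model(ids).shape)  # torch.Size([2, 6, 9])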
Pages: 3132-3142
Number of pages: 11
Related Papers
50 items in total
  • [42] Multi-branch fusion graph neural network based on multi-head attention for childhood seizure detection
    Li, Yang
    Yang, Yang
    Song, Shangling
    Wang, Hongjun
    Sun, Mengzhou
    Liang, Xiaoyun
    Zhao, Penghui
    Wang, Baiyang
    Wang, Na
    Sun, Qiyue
    Han, Zijuan
    FRONTIERS IN PHYSIOLOGY, 2024, 15
  • [43] Attention as Relation: Learning Supervised Multi-head Self-Attention for Relation Extraction
    Liu, Jie
    Chen, Shaowei
    Wang, Bingquan
    Zhang, Jiaxin
    Li, Na
    Xu, Tong
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 3787 - 3793
  • [44] ResGAT: an improved graph neural network based on multi-head attention mechanism and residual network for paper classification
    Huang, Xuejian
    Wu, Zhibin
    Wang, Gensheng
    Li, Zhipeng
    Luo, Yuansheng
    Wu, Xiaofang
    SCIENTOMETRICS, 2024, 129 (02) : 1015 - 1036
  • [45] Span-based Joint Extracting Subjects and Objects and Classifying Relations with Multi-head Self-attention
    Zheng, Xuanang
    Zhang, Lingli
    Zheng, Wei
    Hu, Wenxin
    2021 IEEE 9TH INTERNATIONAL CONFERENCE ON INFORMATION, COMMUNICATION AND NETWORKS (ICICN 2021), 2021, : 457 - 463
  • [46] Combining Multi-Head Attention and Sparse Multi-Head Attention Networks for Session-Based Recommendation
    Zhao, Zhiwei
    Wang, Xiaoye
    Xiao, Yingyuan
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [47] Multi-Head Attention for End-to-End Neural Machine Translation
    Fung, Ivan
    Mak, Brian
    2018 11TH INTERNATIONAL SYMPOSIUM ON CHINESE SPOKEN LANGUAGE PROCESSING (ISCSLP), 2018, : 250 - 254
  • [48] Neural Linguistic Steganalysis via Multi-Head Self-Attention
    Jiao, Sai-Mei
    Wang, Hai-feng
    Zhang, Kun
    Hu, Ya-qi
JOURNAL OF ELECTRICAL AND COMPUTER ENGINEERING, 2021, 2021
  • [49] Duplicate Question Detection based on Neural Networks and Multi-head Attention
    Zhang, Heng
    Chen, Liangyu
    PROCEEDINGS OF THE 2019 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2019, : 13 - 18
  • [50] Joint extraction of entities and relations in biomedical text with self-attention mechanism
    Chen, Jizhi
    Gu, Junzhong
    BASIC & CLINICAL PHARMACOLOGY & TOXICOLOGY, 2019, 125 : 3 - 3