RMAN: Relational multi-head attention neural network for joint extraction of entities and relations

Cited by: 25
|
Authors
Lai, Taiqu [1 ]
Cheng, Lianglun [2 ]
Wang, Depei [1 ]
Ye, Haiming [2 ]
Zhang, Weiwen [2 ]
Affiliations
[1] Guangdong Univ Technol, Sch Automat, Guangzhou, Peoples R China
[2] Guangdong Univ Technol, Sch Comp, Guangzhou, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Joint extraction of entities and relations; Relation feature; Multi-head attention; Sequence annotation; INFORMATION;
DOI
10.1007/s10489-021-02600-2
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The task of extracting entities and relations has evolved from distributed extraction to joint extraction. The joint model overcomes the disadvantages of the distributed extraction method and strengthens the information interaction between entities and relations. However, existing joint models rarely pay attention to the semantic information between words, which limits their ability to resolve overlapping relations. In this paper, we propose the RMAN model for joint extraction of entities and relations, which consists of a multi-feature fusion encoder for sentence representation and a decoder for sequence annotation. We first add a multi-head attention layer after a Bi-LSTM to obtain sentence representations, leveraging the attention mechanism to capture relation-aware sentence representations. Then, we perform sequence annotation on the sentence representation to obtain entity pairs. Experiments on the NYT-single, NYT-multi and WebNLG datasets demonstrate that our model can efficiently extract overlapping triples and outperforms other baselines.
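The core encoder step described in the abstract, multi-head self-attention applied over Bi-LSTM hidden states, can be sketched as below. This is a minimal NumPy illustration of scaled dot-product multi-head attention, not the authors' implementation; all shapes, weight names, and hyperparameters (`T`, `d`, `num_heads`) are illustrative assumptions.

```python
import numpy as np

def multi_head_attention(H, Wq, Wk, Wv, num_heads):
    """Scaled dot-product multi-head self-attention over encoder states.

    H:  (T, d) matrix of per-token states (e.g. Bi-LSTM outputs).
    Wq, Wk, Wv: (d, d) projection matrices, split evenly across heads.
    Returns a (T, d) relation-aware sentence representation.
    """
    T, d = H.shape
    dh = d // num_heads                     # per-head dimension
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    heads = []
    for i in range(num_heads):
        q = Q[:, i * dh:(i + 1) * dh]
        k = K[:, i * dh:(i + 1) * dh]
        v = V[:, i * dh:(i + 1) * dh]
        scores = q @ k.T / np.sqrt(dh)      # (T, T) token-to-token scores
        scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
        w = np.exp(scores)
        w /= w.sum(axis=-1, keepdims=True)  # softmax over the key axis
        heads.append(w @ v)                 # (T, dh) per-head context
    return np.concatenate(heads, axis=-1)   # (T, d) concatenated heads

rng = np.random.default_rng(0)
T, d = 6, 8                                 # 6 tokens, 8-dim states (assumed)
H = rng.standard_normal((T, d))             # stand-in for Bi-LSTM outputs
out = multi_head_attention(
    H, *(rng.standard_normal((d, d)) for _ in range(3)), num_heads=2)
print(out.shape)  # (6, 8)
```

In the paper's pipeline, a representation of this shape would then be fed to the sequence-annotation decoder, which assigns a tag to each of the `T` tokens to recover entity pairs.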
Pages: 3132-3142
Page count: 11
Related papers
50 records in total
  • [21] Extracting biomedical relations via a multi-head attention based graph convolutional network
    Wang, Erniu
    Wang, Fan
    Yang, Zhihao
    Wang, Lei
    Zhang, Yin
    Lin, Hongfei
    Wang, Jian
    2020 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE, 2020, : 793 - 798
  • [22] A Multi-Head Convolutional Neural Network with Multi-Path Attention Improves Image Denoising
    Zhang, Jiahong
    Qu, Meijun
    Wang, Ye
    Cao, Lihong
    PRICAI 2022: TRENDS IN ARTIFICIAL INTELLIGENCE, PT III, 2022, 13631 : 338 - 351
  • [23] MSnet: Multi-Head Self-Attention Network for Distantly Supervised Relation Extraction
    Sun, Tingting
    Zhang, Chunhong
    Ji, Yang
    Hu, Zheng
    IEEE ACCESS, 2019, 7 : 54472 - 54482
  • [24] Network Configuration Entity Extraction Method Based on Transformer with Multi-Head Attention Mechanism
    Yang, Yang
    Qu, Zhenying
    Yan, Zefan
    Gao, Zhipeng
    Wang, Ti
    CMC-COMPUTERS MATERIALS & CONTINUA, 2024, 78 (01): : 735 - 757
  • [25] Arabic cyberbullying detection system using convolutional neural network and multi-head attention
    Azzeh M.
    Alhijawi B.
    Tabbaza A.
    Alabboshi O.
    Hamdan N.
    Jaser D.
    International Journal of Speech Technology, 2024, 27 (03) : 521 - 537
  • [26] TMH: Two-Tower Multi-Head Attention neural network for CTR prediction
    An, Zijian
    Joe, Inwhee
    PLOS ONE, 2024, 19 (03):
  • [27] A Graph Neural Network Social Recommendation Algorithm Integrating the Multi-Head Attention Mechanism
    Yi, Huawei
    Liu, Jingtong
    Xu, Wenqian
    Li, Xiaohui
    Qian, Huihui
    ELECTRONICS, 2023, 12 (06)
  • [28] On the diversity of multi-head attention
    Li, Jian
    Wang, Xing
    Tu, Zhaopeng
    Lyu, Michael R.
    NEUROCOMPUTING, 2021, 454 : 14 - 24
  • [29] Joint Drug Entities and Relations Extraction Based on Neural Networks
    Cao M.
    Yang Z.
    Luo L.
    Lin H.
    Wang J.
    Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2019, 56 (07): : 1432 - 1440
  • [30] Neural News Recommendation with Multi-Head Self-Attention
    Wu, Chuhan
    Wu, Fangzhao
    Ge, Suyu
    Qi, Tao
    Huang, Yongfeng
    Xie, Xing
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 6389 - 6394