Improving Graph Convolutional Networks Based on Relation-Aware Attention for End-to-End Relation Extraction

Cited by: 30
Authors
Hong, Yin [1 ]
Liu, Yanxia [1 ,2 ]
Yang, Suizhu [1 ]
Zhang, Kaiwen [1 ]
Wen, Aiqing [1 ]
Hu, Jianjun [2 ]
Affiliations
[1] South China Univ Technol, Sch Software Engn, Guangzhou 26467, Peoples R China
[2] Univ South Carolina, Dept Comp Sci & Engn, Columbia, SC 29201 USA
Source
IEEE ACCESS | 2020, Vol. 8
Funding
National Natural Science Foundation of China
Keywords
Graph convolutional network; joint extraction of entities and relations; attention; sequential labelling;
DOI
10.1109/ACCESS.2020.2980859
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
In this paper, we present a novel end-to-end neural model based on graph convolutional networks (GCN) for jointly extracting entities and the relations between them. The model divides joint extraction into two sub-tasks: it first detects entity spans and then identifies the relation types among all detected spans simultaneously. To capture the full interaction between entities and relations, we propose a novel relation-aware attention mechanism that produces a relation representation for each pair of entity spans. A complete graph is then constructed over all extracted entity spans, where the nodes are entity spans and the edges are relation representations. In addition, we improve the original GCN so that it uses both adjacent node features and edge information when encoding node features. Experiments on two public datasets show that our model outperforms all baseline methods.
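The abstract outlines an architecture rather than giving equations, so the short PyTorch sketch below is included only to illustrate the two ideas it names: a relation-aware attention that produces an edge (relation) representation for every pair of detected entity spans, and a GCN-style layer over the resulting complete graph that aggregates neighbor node features together with those edge features. All class names, the pairing scheme used to form edge vectors, and the mean aggregation are assumptions made for this sketch, not the authors' implementation.

# Illustrative sketch (not the paper's code): relation-aware attention over
# entity spans followed by an edge-aware GCN layer on a complete graph.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationAwareAttention(nn.Module):
    """Builds a relation representation r_ij for each ordered span pair (i, j)."""

    def __init__(self, dim: int):
        super().__init__()
        self.query = nn.Linear(dim, dim)
        self.key = nn.Linear(dim, dim)
        self.value = nn.Linear(dim, dim)

    def forward(self, spans: torch.Tensor) -> torch.Tensor:
        # spans: (n, dim) -- one vector per detected entity span.
        q, k, v = self.query(spans), self.key(spans), self.value(spans)
        scores = q @ k.t() / spans.size(-1) ** 0.5      # (n, n) pairwise attention scores
        weights = torch.softmax(scores, dim=-1)
        # Edge vector for pair (i, j): span i's features plus the attention-weighted
        # value of span j. This pairing scheme is an illustrative assumption.
        edges = spans.unsqueeze(1) + weights.unsqueeze(-1) * v.unsqueeze(0)
        return edges                                    # (n, n, dim)

class EdgeAwareGCNLayer(nn.Module):
    """GCN-style layer on a complete graph using both node and edge features."""

    def __init__(self, dim: int):
        super().__init__()
        self.node_proj = nn.Linear(dim, dim)
        self.edge_proj = nn.Linear(dim, dim)

    def forward(self, nodes: torch.Tensor, edges: torch.Tensor) -> torch.Tensor:
        # Message from neighbor j to node i combines j's features with edge (i, j).
        messages = self.node_proj(nodes).unsqueeze(0) + self.edge_proj(edges)  # (n, n, dim)
        aggregated = messages.mean(dim=1)               # mean over all neighbors
        return F.relu(nodes + aggregated)               # residual node update

if __name__ == "__main__":
    dim, n_spans = 64, 5
    spans = torch.randn(n_spans, dim)                   # toy entity-span vectors
    attn, gcn = RelationAwareAttention(dim), EdgeAwareGCNLayer(dim)
    edge_repr = attn(spans)
    updated = gcn(spans, edge_repr)
    print(updated.shape)                                # torch.Size([5, 64])

In the paper's setting, the span vectors would come from a sequence-labelling encoder and the updated node features would feed a relation classifier; here random tensors simply stand in for both ends of the pipeline.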
Pages: 51315-51323
Number of pages: 9
Related papers (50 in total; first 10 shown)
[1] Tao, Zhihua; Ouyang, Chunping; Liu, Yongbin; Chung, Tonglee; Cao, Yixin. Multi-head attention graph convolutional network model: End-to-end entity and relation joint extraction based on multi-head attention graph convolutional network. CAAI Transactions on Intelligence Technology, 2023, 8(02): 468-477.
[2] Tan, Zichang; Yan, Yang; Wan, Jun; Guo, Guodong; Li, Stan Z. Relation-Aware Pedestrian Attribute Recognition with Graph Convolutional Networks. Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI 2020), 2020, 34: 12055-12062.
[3] Wu, Jiajin; Yang, Bo; Li, Dongsheng; Deng, Lihui. A semantic relation-aware deep neural network model for end-to-end conversational recommendation. Applied Soft Computing, 2023, 132.
[4] Zhu, Renbo; Ma, Meng; Wang, Ping. RAGA: Relation-Aware Graph Attention Networks for Global Entity Alignment. Advances in Knowledge Discovery and Data Mining (PAKDD 2021), Part I, 2021, 12712: 501-513.
[5] Sun, Aijing; Wang, Guoqing. Neighbor Relation-Aware Graph Convolutional Network for Recommendation. Computer Engineering and Applications, 2023, 59(09): 112-122.
[6] Guo, Zhijiang; Zhang, Yan; Lu, Wei. Attention Guided Graph Convolutional Networks for Relation Extraction. 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), 2019: 241-251.
[7] Pawar, Sachin; Bhattacharyya, Pushpak; Palshikar, Girish K. End-to-End Relation Extraction Using Markov Logic Networks. Computational Linguistics and Intelligent Text Processing (CICLing 2016), Part II, 2018, 9624: 535-551.
[8] Fang, Yujie; Li, Xin; Ye, Rui; Tan, Xiaoyan; Zhao, Peiyao; Wang, Mingzhong. Relation-aware Graph Convolutional Networks for Multi-relational Network Alignment. ACM Transactions on Intelligent Systems and Technology, 2023, 14(02).
[9] Pawar, Sachin; Bhattacharyya, Pushpak; Palshikar, Girish K. End-to-end Relation Extraction using Neural Networks and Markov Logic Networks. 15th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2017), Vol. 1: Long Papers, 2017: 818-827.
[10] Mo, Xian; Tang, Rui; Liu, Hao. A relation-aware heterogeneous graph convolutional network for relationship prediction. Information Sciences, 2023, 623: 311-323.