Distant Supervision Relation Extraction with Improved PCNN and Multi-level Attention

Cited: 1
Authors
Zou, Yang [1 ]
Wang, Qifei [1 ]
Wang, Zhen [1 ]
Zhou, Jian [1 ]
Zeng, Xiaoqin [1 ]
Affiliations
[1] Hohai Univ, Sch Comp & Informat, Inst Intelligence Sci & Technol, Nanjing 210098, Peoples R China
Keywords
relation extraction; distant supervision; I-PCNN; multi-level attention;
DOI
10.1007/978-3-031-40283-8_27
CLC Classification
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In the research field of relation extraction, a large amount of labeled training data can be obtained quickly through distant supervision. However, distant supervision is inevitably accompanied by wrong labelling, and the resulting noisy data can substantially hurt the performance of a relation extraction model. Previous studies focus mainly on de-noising by designing neural networks with intra-bag or inter-bag attention, but do not sufficiently take noise interference at different levels into account. Moreover, the conventional Piecewise Convolutional Neural Network (PCNN) encodes sentences under the assumption that each segmented feature contributes equally to the relation, which is unreasonable. To alleviate these issues, we propose a distant supervision relation extraction model with an Improved PCNN (I-PCNN) and multi-level attention. By incorporating word-level, sentence-level, and bag-level attention, the model effectively reduces the influence of noisy data in a data set. It also enhances PCNN with self-attention, which improves the encoding quality of sentences. Experiments conducted on the New York Times (NYT) data set show that the proposed model evidently outperforms several baseline models in terms of Precision, Recall, AUC, and P@N.
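To make the two core mechanisms in the abstract concrete, here is a minimal, illustrative sketch of PCNN-style piecewise max pooling and bag-level selective attention. This is not the authors' I-PCNN implementation; function names, shapes, and the use of plain NumPy are assumptions for illustration only.

```python
import numpy as np

def piecewise_max_pool(conv_features, head_pos, tail_pos):
    """PCNN-style piecewise max pooling (illustrative sketch).

    conv_features: (seq_len, n_filters) convolution outputs for one sentence.
    head_pos, tail_pos: token indices of the two entities; they split the
    sentence into three segments, and each segment is max-pooled separately
    instead of pooling over the whole sentence at once.
    """
    lo, hi = sorted((head_pos, tail_pos))
    segments = [
        conv_features[: lo + 1],         # up to and including the first entity
        conv_features[lo + 1 : hi + 1],  # between the two entities
        conv_features[hi + 1 :],         # after the second entity
    ]
    pooled = [seg.max(axis=0) if len(seg) else np.zeros(conv_features.shape[1])
              for seg in segments]
    return np.concatenate(pooled)        # shape: (3 * n_filters,)

def bag_attention(sentence_reprs, relation_query):
    """Bag-level selective attention (illustrative sketch).

    Weights each sentence in a bag by its similarity to a relation query
    vector, so sentences likely to be wrongly labelled get small weights.
    sentence_reprs: (n_sentences, dim); relation_query: (dim,).
    """
    scores = sentence_reprs @ relation_query   # similarity per sentence
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                   # softmax over the bag
    return weights @ sentence_reprs            # weighted bag representation
```

Word-level and sentence-level attention in the paper follow the same weighted-sum pattern, applied over tokens within a sentence and over segment features, respectively.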
Pages: 315-327
Page count: 13
Related Papers (50 total)
  • [1] Ye, Y.; Jiang, S.; Wang, S.; Li, H. Distant Supervision Relation Extraction Based on Mutual Information and Multi-level Attention. Neural Network World, 2022, 32(3): 163-179.
  • [2] He, Ying; Li, Zhixu; Liu, Guanfeng; Cao, Fangfei; Chen, Zhigang; Wang, Ke; Ma, Jie. Bootstrapped Multi-level Distant Supervision for Relation Extraction. Web Information Systems Engineering (WISE 2018), Pt. I, 2018, 11233: 408-423.
  • [3] He, Ying; Li, Zhixu; Yang, Qiang; Chen, Zhigang; Liu, An; Zhao, Lei; Zhou, Xiaofang. End-to-End Relation Extraction Based on Bootstrapped Multi-level Distant Supervision. World Wide Web, 2020, 23(5): 2933-2956.
  • [4] Ji, Guoliang; Liu, Kang; He, Shizhu; Zhao, Jun. Distant Supervision for Relation Extraction with Sentence-Level Attention and Entity Descriptions. Thirty-First AAAI Conference on Artificial Intelligence, 2017: 3060-3066.
  • [5] Zhou, Peng; Xu, Jiaming; Qi, Zhenyu; Bao, Hongyun; Chen, Zhineng; Xu, Bo. Distant Supervision for Relation Extraction with Hierarchical Selective Attention. Neural Networks, 2018, 108: 240-247.
  • [6] She, Heng; Wu, Bin; Wang, Bai; Chi, Renjun. Distant Supervision for Relation Extraction with Hierarchical Attention and Entity Descriptions. 2018 International Joint Conference on Neural Networks (IJCNN), 2018.
  • [7] Phi, Van-Thuy; Santoso, Joan; Tran, Van-Hien; Shindo, Hiroyuki; Shimbo, Masashi; Matsumoto, Yuji. Distant Supervision for Relation Extraction via Piecewise Attention and Bag-Level Contextual Inference. IEEE Access, 2019, 7: 103570-103582.
  • [8] Zhang, Jing; Cao, Meilin. Distant Supervision for Relation Extraction with Hierarchical Attention-Based Networks. Expert Systems with Applications, 2023, 220.
  • [9] Liu, Meizhen; Zhou, Fengyu; He, Jiakai; Yan, Xiaohui. Knowledge Graph Attention Mechanism for Distant Supervision Neural Relation Extraction. Knowledge-Based Systems, 2022, 256.