Relation Extraction using Multi-Encoder LSTM Network on a Distant Supervised Dataset

Cited by: 7
Authors:
Banerjee, Siddhartha [1 ]
Tsioutsiouliklis, Kostas [1 ]
Affiliation:
[1] Yahoo, Sunnyvale, CA 94089 USA
Keywords:
DOI: 10.1109/ICSC.2018.00040
CLC classification: TP18 [Artificial Intelligence Theory];
Discipline codes: 081104; 0812; 0835; 1405;
Abstract:
Relation extraction techniques are used to find potential relational facts in textual content. Relation extraction systems require large amounts of training data to model the semantics of sentences and identify relations. Distant supervision, often used to construct training data for relation extraction, produces noisy alignments that can hurt the performance of relation extraction systems. To address this, we propose a simple yet effective technique to automatically compute confidence levels for alignments. We compute the confidence values of automatically labeled content using co-occurrence statistics of relations and the dependency patterns of aligned sentences. Thereafter, we propose a novel multi-encoder bidirectional Long Short-Term Memory (LSTM) model to identify relations in a given sentence. We use different features (words, part-of-speech (POS) tags, and dependency paths) in each encoder and concatenate the hidden states of all the encoders to predict the relations. Our experiments show that a multi-encoder network can handle different features together to predict relations more accurately (approximately 9% improvement over a single-encoder model). We also conduct visualization experiments to show that our model learns intermediate representations effectively to identify relations in sentences.
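The multi-encoder architecture described in the abstract can be illustrated with a minimal sketch, assuming PyTorch; this is not the authors' released code. Each encoder embeds one feature sequence (word, POS-tag, or dependency-path ids) and runs it through a bidirectional LSTM; the final hidden states of all encoders are concatenated and fed to a linear layer over relation labels. All vocabulary sizes, dimensions, and the number of relation classes are illustrative placeholders.

```python
import torch
import torch.nn as nn


class FeatureEncoder(nn.Module):
    """Embeds one feature sequence and encodes it with a bidirectional LSTM."""

    def __init__(self, vocab_size, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)

    def forward(self, ids):                       # ids: (batch, seq_len)
        emb = self.embedding(ids)                 # (batch, seq_len, emb_dim)
        _, (h_n, _) = self.bilstm(emb)            # h_n: (2, batch, hidden_dim)
        # Concatenate the final forward and backward hidden states.
        return torch.cat([h_n[0], h_n[1]], dim=-1)  # (batch, 2 * hidden_dim)


class MultiEncoderRelationClassifier(nn.Module):
    """Concatenates word, POS, and dependency-path encodings to predict relations."""

    def __init__(self, word_vocab, pos_vocab, dep_vocab,
                 num_relations, hidden_dim=128):
        super().__init__()
        self.word_enc = FeatureEncoder(word_vocab, hidden_dim=hidden_dim)
        self.pos_enc = FeatureEncoder(pos_vocab, hidden_dim=hidden_dim)
        self.dep_enc = FeatureEncoder(dep_vocab, hidden_dim=hidden_dim)
        # Three encoders, each producing a 2 * hidden_dim vector.
        self.classifier = nn.Linear(3 * 2 * hidden_dim, num_relations)

    def forward(self, word_ids, pos_ids, dep_ids):
        h = torch.cat([self.word_enc(word_ids),
                       self.pos_enc(pos_ids),
                       self.dep_enc(dep_ids)], dim=-1)
        return self.classifier(h)                 # relation logits


# Toy usage: a batch of 2 sentences of length 10 with hypothetical vocab sizes.
model = MultiEncoderRelationClassifier(word_vocab=20000, pos_vocab=50,
                                        dep_vocab=60, num_relations=25)
word_ids = torch.randint(1, 20000, (2, 10))
pos_ids = torch.randint(1, 50, (2, 10))
dep_ids = torch.randint(1, 60, (2, 10))
logits = model(word_ids, pos_ids, dep_ids)        # shape: (2, 25)
```

In this sketch each feature type gets its own embedding space and encoder, so noisy or uninformative features in one view do not distort the representations learned from the others before the concatenation step.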
Pages: 235-238
Number of pages: 4