Distant supervision for relation extraction with hierarchical attention-based networks

Times Cited: 5
Authors
Zhang, Jing [1]
Cao, Meilin [2]
Affiliations
[1] Southeast Univ, Sch Cyber Sci & Engn, 2 SEU Rd, Nanjing 211189, Peoples R China
[2] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, 200 Xiaolingwei St, Nanjing 210094, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Distant supervision; Relation extraction; Multi-instance learning; Attention mechanism;
DOI
10.1016/j.eswa.2023.119727
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Distant supervision employs external knowledge bases to automatically label corpora. The labeled sentences are usually packaged into bags and trained for relation extraction under a multi-instance learning paradigm. Automated distant supervision inevitably introduces label noise. Previous studies that used sentence-level attention mechanisms for de-noising considered neither the correlations among sentences within a bag nor the correlations among bags. As a result, a large amount of useful supervision information is lost, which degrades the performance of the learned relation extraction models. Moreover, these methods ignore the lack of feature information in bags with few sentences (especially one-sentence bags). To address these issues, this paper proposes hierarchical attention-based networks that de-noise at both the sentence and bag levels. When computing bag representations, sentence-level attention weights the sentence representations while accounting for the correlations among sentences within each bag. Bag-level attention then combines similar bags according to their correlations, which enriches target bags that have poor feature information and assigns more appropriate weights when computing bag-group representations. Both attention levels make full use of the supervision information to improve model performance. The proposed method is compared with nine state-of-the-art methods on the New York Times and Google IISc Distant Supervision datasets, and the experimental results show clear advantages on relation extraction tasks.
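To make the two attention levels concrete, the sketch below is a minimal PyTorch illustration of the idea summarized in the abstract, not the authors' implementation: sentence-level attention mixes correlated sentences within a bag before pooling them into a bag vector, and bag-level attention combines similar bags within a bag group so that bags with few sentences are enriched by related ones. The module names, the correlation layers, and the residual enhancement step are all assumptions made for illustration; a sentence encoder (e.g., a PCNN) is assumed to produce the sentence vectors upstream.

import torch
import torch.nn as nn


class SentenceLevelAttention(nn.Module):
    """Weights the sentences of one bag, letting each sentence attend to the
    other sentences in the same bag before the bag vector is pooled."""

    def __init__(self, dim):
        super().__init__()
        self.inter = nn.Linear(dim, dim, bias=False)  # sentence-sentence correlation
        self.score = nn.Linear(dim, 1, bias=False)    # per-sentence importance

    def forward(self, sents):                          # sents: (num_sents, dim)
        # correlation-aware features: every sentence mixes in the others
        sim = torch.softmax(sents @ self.inter(sents).T, dim=-1)   # (n, n)
        mixed = sim @ sents                                        # (n, dim)
        alpha = torch.softmax(self.score(torch.tanh(mixed)).squeeze(-1), dim=0)
        return alpha @ sents                                       # bag vector (dim,)


class BagLevelAttention(nn.Module):
    """Combines the bags of one bag group (bags labeled with the same relation),
    so bags with few sentences can borrow evidence from similar bags."""

    def __init__(self, dim):
        super().__init__()
        self.query = nn.Parameter(torch.randn(dim))    # relation-specific query vector

    def forward(self, bags):                           # bags: (num_bags, dim)
        beta = torch.softmax(bags @ self.query, dim=0) # similarity of each bag to the query
        group = beta @ bags                            # bag-group vector (dim,)
        enhanced = bags + group                        # enrich each bag with group evidence
        return enhanced, group


if __name__ == "__main__":
    dim = 64
    sent_att, bag_att = SentenceLevelAttention(dim), BagLevelAttention(dim)
    # three toy bags containing 1, 3, and 5 pre-encoded sentences
    bags = torch.stack([sent_att(torch.randn(n, dim)) for n in (1, 3, 5)])
    enhanced, group = bag_att(bags)
    print(enhanced.shape, group.shape)   # torch.Size([3, 64]) torch.Size([64])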
Pages: 9
Related Papers
50 records in total
  • [1] Distant supervision for relation extraction with hierarchical selective attention
    Zhou, Peng
    Xu, Jiaming
    Qi, Zhenyu
    Bao, Hongyun
    Chen, Zhineng
    Xu, Bo
    NEURAL NETWORKS, 2018, 108 : 240 - 247
  • [2] Hybrid attention-based transformer block model for distant supervision relation extraction
    Xiao, Yan
    Jin, Yaochu
    Cheng, Ran
    Hao, Kuangrong
    NEUROCOMPUTING, 2022, 470 : 29 - 39
  • [3] Distant Supervision for Relation Extraction with Hierarchical Attention and Entity Descriptions
    She, Heng
    Wu, Bin
    Wang, Bai
    Chi, Renjun
    2018 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2018,
  • [4] Chemical-induced disease relation extraction via attention-based distant supervision
    Gu, Jinghang
    Sun, Fuqing
    Qian, Longhua
    Zhou, Guodong
    BMC BIOINFORMATICS, 2019, 20 (1)
  • [5] Attention-Based Gated Convolutional Neural Networks for Distant Supervised Relation Extraction
    Li, Xingya
    Chen, Yufeng
    Xu, Jinan
    Zhang, Yujie
    CHINESE COMPUTATIONAL LINGUISTICS, CCL 2019, 2019, 11856 : 246 - 257
  • [6] Piecewise convolutional neural networks with position attention and similar bag attention for distant supervision relation extraction
    Li, Weijiang
    Wang, Qing
    Wu, Jiekun
    Yu, Zhengtao
    APPLIED INTELLIGENCE, 2022, 52 (04) : 4599 - 4609
  • [7] Attention-Based Capsule Networks with Dynamic Routing for Relation Extraction
    Zhang, Ningyu
    Deng, Shumin
    Sun, Zhanlin
    Chen, Xi
    Zhang, Wei
    Chen, Huajun
    2018 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2018), 2018, : 986 - 992
  • [8] Distant Supervised Relation Extraction with Hierarchical Attention Mechanism
    Liu, Jianyi
    Chen, Liandong
    Shi, Rui
    Xu, Jie
    Liu, An
    2021 THE 7TH INTERNATIONAL CONFERENCE ON COMMUNICATION AND INFORMATION PROCESSING, ICCIP 2021, 2021, : 44 - 50