Distant supervision for relation extraction with hierarchical attention-based networks

Cited by: 5
Authors
Zhang, Jing [1]
Cao, Meilin [2]
Affiliations
[1] Southeast Univ, Sch Cyber Sci & Engn, 2 SEU Rd, Nanjing 211189, Peoples R China
[2] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, 200 Xiaolingwei St, Nanjing 210094, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Distant supervision; Relation extraction; Multi-instance learning; Attention mechanism;
DOI
10.1016/j.eswa.2023.119727
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Distant supervision employs external knowledge bases to label corpora automatically. The labeled sentences in a corpus are usually grouped into bags and trained for relation extraction under a multi-instance learning paradigm. Automated distant supervision inevitably introduces label noise. Previous studies that de-noise with sentence-level attention mechanisms consider neither the correlation among sentences in a bag nor the correlation among bags. As a result, a large amount of effective supervision information is lost, which degrades the performance of the learned relation extraction models. Moreover, these methods ignore the lack of feature information in bags with few sentences (especially one-sentence bags). To address these issues, this paper proposes hierarchical attention-based networks that de-noise at both the sentence and bag levels. When computing a bag representation, we weight sentence representations using sentence-level attention that accounts for correlations among sentences within each bag. We then employ bag-level attention to combine similar bags according to their correlations, which enriches the features of target bags with poor feature information and assigns more appropriate weights when computing the bag-group representation. Both sentence-level and bag-level attention make full use of the supervision information to improve model performance. The proposed method was compared with nine state-of-the-art methods on the New York Times and Google IISc Distant Supervision datasets; the experimental results show its clear advantages in relation extraction tasks.
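The hierarchy the abstract describes — sentence-level attention pooling sentences into a bag vector, then bag-level attention pooling bags into a group vector — can be sketched in miniature. This is an illustrative toy, not the paper's model: the `relation_query` vector, the dot-product scoring, and the `attend`/`hierarchical_attention` helpers are assumptions standing in for the learned relation embeddings and attention parameters of the actual networks.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    return [e / z for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def attend(vectors, query):
    """Weighted sum of `vectors`, weights from softmax of dot products with `query`."""
    weights = softmax([dot(v, query) for v in vectors])
    dim = len(vectors[0])
    return [sum(w * v[d] for w, v in zip(weights, vectors)) for d in range(dim)]

def hierarchical_attention(bags, relation_query):
    """Two attention levels: sentences -> bag vector, bags -> group vector."""
    bag_reprs = [attend(sentences, relation_query) for sentences in bags]
    return attend(bag_reprs, relation_query)

# Two bags of toy 2-d sentence vectors; the second bag has only one sentence,
# so bag-level attention lets correlated bags compensate for its thin features.
bags = [[[1.0, 0.0], [0.0, 1.0]], [[1.0, 0.0]]]
group_repr = hierarchical_attention(bags, relation_query=[1.0, 0.0])
```

Sentences aligned with the relation query receive larger weights at both levels, which is the mechanism the paper uses to suppress noisily labeled sentences and to let well-populated bags reinforce one-sentence bags.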
Pages: 9