Two Training Strategies for Improving Relation Extraction over Universal Graph

Cited by: 0
Authors
Dai, Qin [1 ]
Inoue, Naoya [2 ]
Takahashi, Ryo [1 ,3 ]
Inui, Kentaro [1 ,3 ]
Affiliations
[1] Tohoku Univ, Sendai, Miyagi, Japan
[2] SUNY Stony Brook, Stony Brook, NY 11794 USA
[3] RIKEN, Ctr Adv Intelligence Project, Tokyo, Japan
Keywords: (none listed)
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
This paper explores how Distantly Supervised Relation Extraction (DS-RE) can benefit from a Universal Graph (UG), i.e., the combination of a Knowledge Graph (KG) and a large-scale text collection. A straightforward extension of a current state-of-the-art neural model for DS-RE with a UG may lead to performance degradation. We first show that this degradation is associated with the difficulty of learning a UG, and then propose two training strategies: (1) Path Type Adaptive Pretraining, which sequentially trains the model on different types of UG paths so as to prevent reliance on a single path type; and (2) a Complexity Ranking Guided Attention mechanism, which restricts the attention span according to the complexity of a UG path so as to force the model to extract features not only from simple UG paths but also from complex ones. Experimental results on both biomedical and NYT10 datasets demonstrate the robustness of our methods, and we achieve a new state-of-the-art result on the NYT10 dataset. The code and datasets used in this paper are available at https://github.com/baodaiqin/UGDSRE.
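The second strategy — restricting attention to paths within a complexity bound so that simple paths cannot dominate — can be illustrated with a minimal sketch. This is a hypothetical toy version, not the authors' implementation (the function name, the raw-score inputs, and the hard rank cutoff are all assumptions; see the linked repository for the real code):

```python
import math

def complexity_guided_attention(scores, complexities, max_rank):
    """Softmax attention over UG paths, restricted to the `max_rank`
    simplest paths. Gradually widening `max_rank` during training would
    force the model to draw features from complex paths, not just
    simple ones (a toy sketch of Complexity Ranking Guided Attention).

    scores:       raw attention logits, one per path
    complexities: a complexity measure per path (lower = simpler)
    max_rank:     how many of the simplest paths may receive attention
    """
    # Rank path indices from simplest to most complex.
    order = sorted(range(len(scores)), key=lambda i: complexities[i])
    allowed = set(order[:max_rank])
    # Mask out paths whose complexity rank exceeds the allowed span,
    # then normalize the surviving scores into attention weights.
    exp = [math.exp(s) if i in allowed else 0.0 for i, s in enumerate(scores)]
    z = sum(exp)
    return [e / z for e in exp]
```

For example, with three paths and `max_rank=2`, the most complex path receives zero weight regardless of its raw score, and the remaining weights are renormalized over the two simplest paths.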
Pages: 3673-3684
Page count: 12
Related papers (items [31]-[40] of 50 shown)
  • [31] Improving Relation Extraction by Knowledge Representation Learning
    Hong, Wenxing
    Li, Shuyan
    Hu, Zhiqiang
    Rasool, Abdur
    Jiang, Qingshan
    Weng, Yang
    2021 IEEE 33RD INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2021), 2021, : 1211 - 1215
  • [32] Improving Relation Extraction with Knowledge-attention
    Li, Pengfei
    Mao, Kezhi
    Yang, Xuefeng
    Li, Qi
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 229 - 239
  • [33] Investigation of improving the pre-training and fine-tuning of BERT model for biomedical relation extraction
    Su, Peng
    Vijay-Shanker, K.
    BMC BIOINFORMATICS, 2022, 23 (01)
  • [35] Extraction of conceptual relation based on HowNet and concept graph
    Liu, Hengwei
    Zhang, Lei
    Yang, Jing
    2012 4TH INTERNATIONAL CONFERENCE ON INTELLIGENT HUMAN-MACHINE SYSTEMS AND CYBERNETICS (IHMSC), VOL 2, 2012, : 288 - 291
  • [36] A Weighted Diffusion Graph Convolutional Network for Relation Extraction
    Chen, Jiusheng
    Li, Zhenlin
    Yu, Hang
    Zhang, Xiaoyu
    JOURNAL OF ELECTRICAL AND COMPUTER ENGINEERING, 2024, 2024
  • [37] Attention Guided Graph Convolutional Networks for Relation Extraction
    Guo, Zhijiang
    Zhang, Yan
    Lu, Wei
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 241 - 251
  • [38] A Hybrid Graph Model for Distant Supervision Relation Extraction
    Duan, Shangfu
    Gao, Huan
    Liu, Bing
    Qi, Guilin
    SEMANTIC WEB, ESWC 2019, 2019, 11503 : 36 - 51
  • [39] Relation Extraction from Clinical Cases for a Knowledge Graph
    Savary, Agata
    Silvanovich, Alena
    Minard, Anne-Lyse
    Hiot, Nicolas
    Ferrari, Mirian Halfeld
    NEW TRENDS IN DATABASE AND INFORMATION SYSTEMS, ADBIS 2022, 2022, 1652 : 353 - 365
  • [40] Graph Neural Networks with Generated Parameters for Relation Extraction
    Zhu, Hao
    Lin, Yankai
    Liu, Zhiyuan
    Fu, Jie
    Chua, Tat-seng
    Sun, Maosong
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 1331 - 1339