Two Training Strategies for Improving Relation Extraction over Universal Graph

Cited: 0
Authors
Dai, Qin [1 ]
Inoue, Naoya [2 ]
Takahashi, Ryo [1 ,3 ]
Inui, Kentaro [1 ,3 ]
Affiliations
[1] Tohoku Univ, Sendai, Miyagi, Japan
[2] SUNY Stony Brook, Stony Brook, NY 11794 USA
[3] RIKEN, Ctr Adv Intelligence Project, Tokyo, Japan
Keywords: (none listed)
DOI: (not available)
Chinese Library Classification (CLC): TP18 [Theory of Artificial Intelligence]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract
This paper explores how Distantly Supervised Relation Extraction (DS-RE) can benefit from the use of a Universal Graph (UG), the combination of a Knowledge Graph (KG) and a large-scale text collection. A straightforward extension of a current state-of-the-art neural model for DS-RE with a UG may lead to degradation in performance. We first report that this degradation is associated with the difficulty of learning a UG and then propose two training strategies: (1) Path Type Adaptive Pretraining, which sequentially trains the model with different types of UG paths so as to prevent reliance on a single type of UG path; and (2) a Complexity Ranking Guided Attention mechanism, which restricts the attention span according to the complexity of a UG path so as to force the model to extract features not only from simple UG paths but also from complex ones. Experimental results on both the biomedical and NYT10 datasets demonstrate the robustness of our methods, which achieve a new state-of-the-art result on the NYT10 dataset. The code and datasets used in this paper are available at https://github.com/baodaiqin/UGDSRE.
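As a rough illustration of the second strategy only, the following is a minimal sketch, not the authors' implementation: the function name complexity_ranked_attention, the use of hop count as a complexity proxy, and the span parameter are all assumptions introduced here. It shows the general idea of restricting the path-level attention softmax to the paths ranked most complex, so that complex UG paths cannot be ignored in favor of simple ones.

import numpy as np

def complexity_ranked_attention(path_scores, path_complexities, span=2):
    """Softmax over path-level attention logits, restricted to the `span`
    paths ranked as most complex. Masked paths receive zero weight."""
    scores = np.asarray(path_scores, dtype=float)
    order = np.argsort(path_complexities)   # simplest -> most complex
    kept = order[-span:]                    # indices allowed to compete
    logits = np.full_like(scores, -np.inf)
    logits[kept] = scores[kept]
    logits = logits - scores[kept].max()    # numerical stability
    weights = np.exp(logits)                # exp(-inf) -> 0 for masked paths
    return weights / weights.sum()

# Toy usage: five UG paths, complexity approximated by hop count.
scores = np.array([2.0, 0.5, 1.2, -0.3, 0.8])
hops = np.array([1, 2, 2, 3, 4])
print(complexity_ranked_attention(scores, hops, span=2))

In this toy run only the two longest paths receive non-zero attention weight; varying the span over training would correspond to gradually widening or narrowing which complexity band the model is forced to draw features from.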
Pages: 3673-3684 (12 pages)