Neural Attentional Relation Extraction with Dual Dependency Trees

Cited by: 3
Authors
Li, Dong [1 ]
Lei, Zhi-Lei [1 ]
Song, Bao-Yan [1 ]
Ji, Wan-Ting [1 ]
Kou, Yue [2 ]
Affiliations
[1] Liaoning Univ, Sch Informat, Shenyang 110036, Peoples R China
[2] Northeastern Univ, Sch Comp Sci & Engn, Shenyang 110004, Peoples R China
Keywords
relation extraction; graph convolutional network (GCN); syntactic dependency tree; semantic dependency tree
DOI
10.1007/s11390-022-2420-2
CLC number
TP3 [Computing Technology, Computer Technology]
Subject classification code
0812
Abstract
Relation extraction has been widely used to find semantic relations between entities in plain text. Dependency trees provide deeper semantic information for relation extraction. However, existing dependency tree based models adopt pruning strategies that are either too aggressive or too conservative, leading to insufficient semantic information or excessive noise in relation extraction models. To overcome this issue, we propose the Neural Attentional Relation Extraction Model with Dual Dependency Trees (DDT-REM), which takes advantage of both the syntactic dependency tree and the semantic dependency tree to capture syntactic features and semantic features, respectively. Specifically, we first propose a novel representation learning method to capture the dependency relations from both syntax and semantics. Second, for the syntactic dependency tree, we propose a local-global attention mechanism to compensate for missing semantic information. We design an extension of graph convolutional networks (GCNs) to perform relation extraction, which effectively improves the extraction accuracy. We conduct experimental studies on three real-world datasets. Compared with traditional methods, our method improves the F1 scores by 0.3, 0.1 and 1.6 on the three datasets, respectively.
Pages: 1369-1381
Number of pages: 13
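
The abstract describes an extension of graph convolutional networks (GCNs) that propagates token features along dependency-tree edges. The sketch below is only a minimal illustration of that basic building block, one GCN layer applied to a dependency adjacency matrix, and is not the authors' DDT-REM implementation; the class name, dimensions, and degree normalization are hypothetical, and the dual-tree encoding and local-global attention described in the paper are omitted.

```python
# Minimal sketch (not the authors' code): one graph-convolution layer that
# propagates token features along dependency-tree edges, the basic building
# block that dependency-tree GCN relation extractors stack and extend.
# Class name, dimensions, and the degree normalization are hypothetical.
import torch
import torch.nn as nn


class DependencyGCNLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   (batch, seq_len, in_dim)   token representations
        # adj: (batch, seq_len, seq_len)  dependency adjacency with self-loops
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)  # node degrees
        agg = torch.bmm(adj, h) / deg                       # mean over neighbours
        return torch.relu(self.linear(agg))


if __name__ == "__main__":
    # Toy example: 2 sentences, 5 tokens each, 16-dim token embeddings,
    # random adjacency standing in for parsed dependency trees.
    h = torch.randn(2, 5, 16)
    adj = (torch.rand(2, 5, 5) > 0.7).float() + torch.eye(5)
    out = DependencyGCNLayer(16, 16)(h, adj)
    print(out.shape)  # torch.Size([2, 5, 16])
```

In the paper's setting, such propagation would be applied over both the syntactic and the semantic dependency trees and combined with attention; the sketch only shows how a single layer aggregates neighbour information over one tree.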