TENET: Joint Entity and Relation Linking with Coherence Relaxation

Cited by: 0
Authors
Lin, Xueling [1 ]
Chen, Lei [1 ]
Zhang, Chaorui [2 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Hong Kong, Peoples R China
[2] Huawei Technol, Theory Lab, Hong Kong, Peoples R China
Keywords
knowledge base; entity linking; relation linking; knowledge; disambiguation
DOI
10.1145/3448016.3457280
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
The joint entity and relation linking task aims to connect the noun phrases (resp., relational phrases) extracted from natural language documents to the entities (resp., predicates) in general knowledge bases (KBs). This task benefits numerous downstream systems, such as question answering and KB population. Previous works on entity and relation linking rely on the global coherence assumption, i.e., entities and predicates within the same document are highly correlated with each other. However, this assumption is not always valid in real-world scenarios. Due to KB incompleteness or data sparsity, sparse coherence among the entities and predicates within the same document is common. Moreover, there may exist isolated entities or predicates that are not related to any other linked concepts. In this paper, we propose TENET, a joint entity and relation linking technique, which relaxes the coherence assumption in an unsupervised manner. Specifically, we formulate the joint entity and relation linking task as a minimum-cost rooted tree cover problem on the knowledge coherence graph constructed from the document. We then propose effective approximation algorithms with pruning strategies to solve this problem and derive the linking results. Extensive experiments on real-world datasets demonstrate the superior effectiveness and efficiency of our method over state-of-the-art techniques.
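To make the formulation in the abstract concrete, the following toy sketch shows how a knowledge coherence graph over candidate KB concepts might be covered by a greedily grown rooted tree. This is not the authors' TENET algorithm or code; Candidate, coherence_cost, and greedy_tree_cover are illustrative names, the token-overlap coherence score is an assumption, and no pruning strategies are included.

from dataclasses import dataclass

@dataclass(frozen=True)
class Candidate:
    mention: str   # noun phrase or relational phrase from the document
    concept: str   # candidate KB entity or predicate
    prior: float   # mention-to-candidate matching cost (lower is better)

def coherence_cost(a: Candidate, b: Candidate) -> float:
    # Toy pairwise coherence: attaching is free if the two KB concepts share a
    # token, otherwise it costs 1 (a stand-in for a real coherence measure).
    shared = set(a.concept.lower().split("_")) & set(b.concept.lower().split("_"))
    return 0.0 if shared else 1.0

def greedy_tree_cover(candidates):
    # Root the tree at the cheapest candidate, then repeatedly attach, for some
    # still-uncovered mention, the candidate with the lowest prior-plus-attachment
    # cost to the tree built so far. Isolated mentions simply pay the maximum
    # attachment cost of 1 in this toy setting.
    mentions = {c.mention for c in candidates}
    root = min(candidates, key=lambda c: c.prior)
    chosen = {root.mention: root}
    while len(chosen) < len(mentions):
        best_cost, best_cand = None, None
        for c in candidates:
            if c.mention in chosen:
                continue
            attach = min(coherence_cost(c, t) for t in chosen.values())
            if best_cost is None or c.prior + attach < best_cost:
                best_cost, best_cand = c.prior + attach, c
        chosen[best_cand.mention] = best_cand
    return chosen

if __name__ == "__main__":
    candidates = [
        Candidate("HKUST", "Hong_Kong_University_of_Science_and_Technology", 0.2),
        Candidate("HKUST", "HKUST_Business_School", 0.6),
        Candidate("located in", "located_in", 0.1),
        Candidate("Hong Kong", "Hong_Kong", 0.1),
    ]
    for mention, cand in greedy_tree_cover(candidates).items():
        print(f"{mention} -> {cand.concept}")

The greedy loop is only meant to show how coherence edges bias linking toward a connected, low-cost tree (here, the shared tokens with "Hong_Kong" favor the university over the business school); the paper's approximation algorithms and pruning strategies for the minimum-cost rooted tree cover problem are more involved.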
Pages: 1142-1155
Page count: 14