TENET: Joint Entity and Relation Linking with Coherence Relaxation

Cited by: 0
Authors
Lin, Xueling [1 ]
Chen, Lei [1 ]
Zhang, Chaorui [2 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Hong Kong, Peoples R China
[2] Huawei Technol, Theory Lab, Hong Kong, Peoples R China
Keywords
knowledge base; entity linking; relation linking; KNOWLEDGE; DISAMBIGUATION;
DOI
10.1145/3448016.3457280
CLC number
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
The joint entity and relation linking task aims to connect the noun phrases (resp., relational phrases) extracted from natural language documents to the entities (resp., predicates) in general knowledge bases (KBs). This task benefits numerous downstream systems, such as question answering and KB population. Previous works on entity and relation linking rely on the global coherence assumption, i.e., entities and predicates within the same document are highly correlated with each other. However, this assumption is not always valid in many real-world scenarios. Due to KB incompleteness or data sparsity, sparse coherence among the entities and predicates within the same document is common. Moreover, there may exist isolated entities or predicates that are not related to any other linked concepts. In this paper, we propose TENET, a joint entity and relation linking technique, which relaxes the coherence assumption in an unsupervised manner. Specifically, we formulate the joint entity and relation linking task as a minimum-cost rooted tree cover problem on the knowledge coherence graph constructed based on the document. We then propose effective approximation algorithms with pruning strategies to solve this problem and derive the linking results. Extensive experiments on real-world datasets demonstrate the superior effectiveness and efficiency of our method against the state-of-the-art techniques.
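The abstract formulates joint linking as a minimum-cost rooted tree cover over a knowledge coherence graph. As a rough illustration of that idea only, the following is a toy greedy heuristic; it is not the paper's approximation algorithm, and all names here (`greedy_linking`, `mention_candidates`, `coherence_cost`) are hypothetical.

```python
def greedy_linking(mention_candidates, coherence_cost, root):
    """Toy greedy tree-cover heuristic (illustrative, NOT the paper's method).

    Grows a tree from `root`, repeatedly attaching, for some still-uncovered
    mention, the candidate with the cheapest coherence edge to any node
    already in the tree.

    mention_candidates: {mention: [candidate, ...]}
    coherence_cost: {(a, b): cost}, treated as symmetric; a missing pair
        means the two concepts are unrelated (infinite cost).
    """
    INF = float("inf")

    def cost(a, b):
        return coherence_cost.get((a, b), coherence_cost.get((b, a), INF))

    tree = {root}        # nodes already connected to the root
    chosen = {}          # mention -> selected candidate
    uncovered = set(mention_candidates)
    total = 0.0
    while uncovered:
        # Pick the globally cheapest (mention, candidate) attachment.
        best = None
        for m in uncovered:
            for cand in mention_candidates[m]:
                c = min(cost(t, cand) for t in tree)
                if best is None or c < best[0]:
                    best = (c, m, cand)
        c, m, cand = best
        if c == INF:
            break  # remaining mentions are isolated from the tree
        tree.add(cand)
        chosen[m] = cand
        uncovered.discard(m)
        total += c
    return chosen, total


# Hypothetical toy instance: two mentions, each with two KB candidates.
cands = {"capital": ["Paris", "Paris_Hilton"],
         "country": ["France", "France_band"]}
costs = {("doc", "Paris"): 1.0, ("doc", "Paris_Hilton"): 5.0,
         ("Paris", "France"): 0.5, ("doc", "France"): 2.0}
print(greedy_linking(cands, costs, root="doc"))
# -> ({'capital': 'Paris', 'country': 'France'}, 1.5)
```

Note how "France" is attached through "Paris" (cost 0.5) rather than directly from the root (cost 2.0): coherent candidates reinforce each other, while the `c == INF` exit crudely mirrors the paper's concern with isolated, unrelated concepts.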
Pages: 1142 - 1155
Number of pages: 14