Retrieval-Enhanced Generative Model for Large-Scale Knowledge Graph Completion

Cited: 0
Authors
Yu, Donghan [1 ]
Yang, Yiming [1 ]
Affiliations
[1] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
Keywords
knowledge graph completion; neural network; information retrieval
DOI
10.1145/3539618.3592052
CLC Classification
TP [Automation Technology; Computer Technology]
Subject Classification
0812
Abstract
The task of knowledge graph completion (KGC) is of great importance. To achieve scalability on large-scale knowledge graphs, recent works formulate KGC as a sequence-to-sequence process, in which both the incomplete triplet (input) and the missing entity (output) are verbalized as text sequences. However, inference with these methods relies solely on the model parameters for implicit reasoning and neglects the KG itself, which limits performance because the model lacks the capacity to memorize a vast number of triplets. To tackle this issue, we introduce ReSKGC, a Retrieval-enhanced Seq2seq KGC model, which selects semantically relevant triplets from the KG and uses them as evidence to guide output generation through explicit reasoning. Our method demonstrates state-of-the-art performance on the benchmark datasets Wikidata5M and WikiKG90Mv2, which contain about 5M and 90M entities, respectively.
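The pipeline the abstract describes — verbalize the incomplete triplet, retrieve semantically relevant triplets from the KG as evidence, and condition the seq2seq model on both — can be sketched roughly as follows. This is an illustrative toy, not the paper's implementation: the function names, the [MASK]/[SEP] markers, and the token-overlap scorer standing in for a real retriever are all assumptions.

```python
# Toy sketch of retrieval-enhanced seq2seq input construction for KGC.
# All names and markers here are illustrative assumptions; the actual
# ReSKGC retriever and verbalization scheme may differ.

def verbalize(head, relation, tail=None):
    """Turn a (possibly incomplete) triplet into a text sequence."""
    if tail is None:
        return f"{head} {relation} [MASK]"
    return f"{head} {relation} {tail}"

def retrieve(query, kg_triplets, k=2):
    """Rank KG triplets by token overlap with the query.

    A crude stand-in for a real sparse/dense retriever such as BM25.
    """
    q_tokens = set(query.lower().split())
    def score(t):
        return len(q_tokens & set(verbalize(*t).lower().split()))
    return sorted(kg_triplets, key=score, reverse=True)[:k]

def build_input(incomplete, kg_triplets, k=2):
    """Concatenate the verbalized query with retrieved evidence triplets."""
    query = verbalize(*incomplete)
    evidence = [verbalize(*t) for t in retrieve(query, kg_triplets, k)]
    return query + " [SEP] " + " [SEP] ".join(evidence)

kg = [
    ("Paris", "capital of", "France"),
    ("France", "located in", "Europe"),
    ("Berlin", "capital of", "Germany"),
]
print(build_input(("Paris", "capital of"), kg, k=1))
# -> Paris capital of [MASK] [SEP] Paris capital of France
```

A seq2seq model would then generate the missing entity from this retrieval-augmented input, reasoning explicitly over the appended evidence rather than relying only on facts memorized in its parameters.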
Pages: 2334-2338 (5 pages)
Related Papers
50 records in total
  • [1] Contrast then Memorize: Semantic Neighbor Retrieval-Enhanced Inductive Multimodal Knowledge Graph Completion
    Zhao, Yu
    Zhang, Ying
    Zhou, Baohang
    Qian, Xinying
    Song, Kehui
    Cai, Xiangrui
    [J]. PROCEEDINGS OF THE 47TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2024, 2024, : 102 - 111
  • [2] Generative adversarial meta-learning knowledge graph completion for large-scale complex knowledge graphs
    Tong, Weiming
    Chu, Xu
    Li, Zhongwei
    Tan, Liguo
    Zhao, Jinxiao
    Pan, Feng
    [J]. JOURNAL OF INTELLIGENT INFORMATION SYSTEMS, 2024, : 1685 - 1701
  • [3] Anytime bottom-up rule learning for large-scale knowledge graph completion
    Meilicke, Christian
    Chekol, Melisachew Wudage
    Betz, Patrick
    Fink, Manuel
    Stuckenschmidt, Heiner
    [J]. The VLDB Journal, 2024, 33 : 131 - 161
  • [5] Using Pairwise Occurrence Information to Improve Knowledge Graph Completion on Large-Scale Datasets
    Balkir, Esma
    Naslidnyk, Masha
    Palfrey, Dave
    Mittal, Arpit
    [J]. 2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 3591 - 3596
  • [6] Large-Scale Graph Mining and Learning for Information Retrieval
    Gao, Bin
    Wang, Taifeng
    Liu, Tie-Yan
    [J]. SIGIR 2012: PROCEEDINGS OF THE 35TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, 2012, : 1194 - 1195
  • [7] Large-scale knowledge graph representation learning
    Badrouni, Marwa
    Katar, Chaker
    Inoubli, Wissem
    [J]. KNOWLEDGE AND INFORMATION SYSTEMS, 2024, 66 (09) : 5479 - 5499
  • [8] ASER: A Large-scale Eventuality Knowledge Graph
    Zhang, Hongming
    Liu, Xin
    Pan, Haojie
    Song, Yangqiu
    Leung, Cane Wing-Ki
    [J]. WEB CONFERENCE 2020: PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE (WWW 2020), 2020, : 201 - 211
  • [9] Distilling the Knowledge of Large-scale Generative Models into Retrieval Models for Efficient Open-domain Conversation
    Kim, Beomsu
    Seo, Seokjun
    Han, Seungju
    Erdenee, Enkhbayar
    Chang, Buru
    [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021, 2021, : 3357 - 3373
  • [10] ROMO: Retrieval-enhanced Offline Model-based Optimization
    Chen, Mingcheng
    Zhao, Haoran
    Zhao, Yuxiang
    Fan, Hulei
    Gao, Hongqiao
    Yu, Yong
    Tian, Zheng
    [J]. 2023 5TH INTERNATIONAL CONFERENCE ON DISTRIBUTED ARTIFICIAL INTELLIGENCE, DAI 2023, 2023,