Latent Graph Recurrent Network for Document Ranking

Cited by: 4
Authors
Dong, Qian [1, 2]
Niu, Shuzi [1]
Affiliations
[1] Chinese Acad Sci, Inst Software, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Ad hoc retrieval; Graph neural network; Transformer;
DOI
10.1007/978-3-030-73197-7_6
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Code
081104; 0812; 0835; 1405;
Abstract
BERT-based ranking models are emerging owing to BERT's superior natural language understanding ability. The attention matrix learned through BERT captures all the word relations in the input text. However, most neural ranking models exploit only the text matching between query and document. To address this problem, we propose a graph recurrent neural network based model that refines word representations from BERT for document ranking, referred to as Latent Graph Recurrent Network (LGRe for short). For each query-document pair, word representations are learned through transformer layers. Based on these word representations, we propose masking strategies to construct a bipartite-core word graph that models the matching between the query and the document. The word representations are further refined by a graph recurrent neural network to strengthen word relations in this graph. The final relevance score is computed from the refined word representations through fully connected layers. Moreover, we propose a triangle distance loss function for the embedding layers as an auxiliary task to obtain discriminative representations. It is optimized jointly with a pairwise ranking loss for the ad hoc document ranking task. Experimental results on the public benchmark TREC Robust04 and WebTrack2009-12 test collections show that LGRe (the implementation is available at https://github.com/DQ0408/LGRe) outperforms state-of-the-art baselines by more than 2%.
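The abstract's pipeline (transformer word representations, a masked bipartite-core word graph, recurrent refinement over that graph, then pooling to a relevance score) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the random `h0` stands in for transformer outputs, and the mean-aggregation with a `tanh` update is a simplified stand-in for the paper's actual masking strategies and graph recurrent cell. All function and variable names here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def bipartite_mask(n_q: int, n_d: int) -> np.ndarray:
    """Bipartite-core adjacency: edges only between query and document words."""
    n = n_q + n_d
    a = np.zeros((n, n))
    a[:n_q, n_q:] = 1.0  # query -> document edges
    a[n_q:, :n_q] = 1.0  # document -> query edges
    return a

def graph_recurrent_refine(h, a, w_self, w_msg, steps=2):
    """Refine word representations by recurrent message passing on the graph.

    Simplified update: mean-aggregate neighbour states through the masked
    adjacency, then mix with the node's own state (a GRU-style cell in the
    actual model would gate this update).
    """
    deg = a.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0                      # guard isolated nodes
    for _ in range(steps):
        msg = (a @ h) / deg                  # mean over graph neighbours
        h = np.tanh(h @ w_self + msg @ w_msg)
    return h

def relevance_score(h, n_q, w_out):
    """Pool refined query/document representations into a scalar score."""
    pooled = np.concatenate([h[:n_q].mean(axis=0), h[n_q:].mean(axis=0)])
    return float(pooled @ w_out)

# Toy run: 3 query words, 5 document words, hidden size 8.
n_q, n_d, d = 3, 5, 8
h0 = rng.normal(size=(n_q + n_d, d))         # stand-in for transformer outputs
a = bipartite_mask(n_q, n_d)
w_self = rng.normal(size=(d, d)) * 0.1
w_msg = rng.normal(size=(d, d)) * 0.1
w_out = rng.normal(size=(2 * d,))

h = graph_recurrent_refine(h0, a, w_self, w_msg)
score = relevance_score(h, n_q, w_out)
```

In this sketch the bipartite mask zeroes out intra-query and intra-document edges, so each refinement step only exchanges information across the query-document boundary, which is the matching signal the model is after; the triangle distance and pairwise ranking losses from the abstract would be applied on top of representations and scores like these during training.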
Pages: 88-103
Number of pages: 16
Related Papers
50 records in total
  • [1] Disentangled Graph Recurrent Network for Document Ranking
    Dong, Qian
    Niu, Shuzi
    Yuan, Tao
    Li, Yucheng
    [J]. DATA SCIENCE AND ENGINEERING, 2022, 7 (01) : 30 - 43
  • [3] CGTR: Convolution Graph Topology Representation for Document Ranking
    Qi, Yuanyuan
    Zhang, Jiayue
    Liu, Yansong
    Xu, Weiran
    Guo, Jun
    [J]. CIKM '20: PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, 2020, : 2173 - 2176
  • [4] Probabilistic Latent Document Network Embedding
    Le, Tuan M. V.
    Lauw, Hady W.
    [J]. 2014 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2014, : 270 - 279
  • [5] Ranking events based on event relation graph for a single document
    Zhong Z.
    Liu Z.
    [J]. Information Technology Journal, 2010, 9 (01) : 174 - 178
  • [6] Infinite Latent Feature Selection: A Probabilistic Latent Graph-Based Ranking Approach
    Roffo, Giorgio
    Melzi, Simone
    Castellani, Umberto
    Vinciarelli, Alessandro
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2017, : 1407 - 1415
  • [7] Socialformer: Social Network Inspired Long Document Modeling for Document Ranking
    Zhou, Yujia
    Dou, Zhicheng
    Yuan, Huaying
    Ma, Zhengyi
    [J]. PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022, : 339 - 347
  • [8] Multi-Document Abstractive Summarization using Chunk-graph and Recurrent Neural Network
    Niu, Jianwei
    Chen, Huan
    Zhao, Qingjuan
    Sun, Limin
    Atiquzzaman, Mohammed
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC), 2017,
  • [9] Signed Graph Neural Network with Latent Groups
    Liu, Haoxin
    Zhang, Ziwei
    Cui, Peng
    Zhang, Yafeng
    Cui, Qiang
    Liu, Jiashuo
    Zhu, Wenwu
    [J]. KDD '21: PROCEEDINGS OF THE 27TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2021, : 1066 - 1075
  • [10] Graph Topic Neural Network for Document Representation
    Xie, Qianqian
    Huang, Jimin
    Du, Pan
    Peng, Min
    Nie, Jian-Yun
    [J]. PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE 2021 (WWW 2021), 2021, : 3055 - 3065