Latent Graph Recurrent Network for Document Ranking

Cited by: 4
Authors
Dong, Qian [1,2]
Niu, Shuzi [1]
Affiliations
[1] Chinese Acad Sci, Inst Software, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Ad hoc retrieval; Graph neural network; Transformer;
DOI
10.1007/978-3-030-73197-7_6
CLC classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
BERT-based ranking models are emerging owing to their superior natural language understanding ability. The attention matrix learned by BERT captures all the word relations in the input text, whereas neural ranking models are concerned only with the text matching between query and document. To address this problem, we propose a graph recurrent neural network based model that refines word representations from BERT for document ranking, referred to as the Latent Graph Recurrent Network (LGRe for short). For each query-document pair, word representations are learned through a transformer layer. Based on these word representations, we propose masking strategies to construct a bipartite-core word graph that models the matching between the query and the document. Word representations are then further refined by a graph recurrent neural network to enhance the word relations in this graph. The final relevance score is computed from the refined word representations through fully connected layers. Moreover, we propose a triangle distance loss function for the embedding layers as an auxiliary task to obtain discriminative representations; it is optimized jointly with a pairwise ranking loss for the ad hoc document ranking task. Experimental results on the public benchmark TREC Robust04 and WebTrack2009-12 test collections show that LGRe (the implementation is available at https://github.com/DQ0408/LGRe) outperforms state-of-the-art baselines by more than 2%.
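The abstract describes masking word relations down to a bipartite-core graph between query and document tokens and then refining representations with a graph recurrent update. The sketch below is one plausible reading of that pipeline, not the paper's code: the mask shape, the GRU-style gated update, and all function and variable names here are assumptions for illustration only (the actual implementation is at the GitHub link above).

```python
import numpy as np

def bipartite_mask(n_query, n_doc):
    """Hypothetical adjacency allowing only query<->document edges
    (a bipartite-core word graph, as sketched in the abstract)."""
    n = n_query + n_doc
    A = np.zeros((n, n))
    A[:n_query, n_query:] = 1.0   # query -> document edges
    A[n_query:, :n_query] = 1.0   # document -> query edges
    return A

def graph_recurrent_step(H, A, Wz, Wh):
    """One GRU-like refinement step (an assumed form, not the paper's):
    mean-aggregate neighbor messages, then apply a gated update."""
    deg = A.sum(axis=1, keepdims=True)
    M = (A / np.maximum(deg, 1.0)) @ H          # mean message from graph neighbors
    z = 1.0 / (1.0 + np.exp(-(M @ Wz)))         # update gate in (0, 1)
    H_tilde = np.tanh(M @ Wh)                   # candidate refined state
    return (1 - z) * H + z * H_tilde            # convex blend of old and new

rng = np.random.default_rng(0)
n_q, n_d, dim = 3, 5, 8
H = rng.normal(size=(n_q + n_d, dim))           # word representations, e.g. from BERT
A = bipartite_mask(n_q, n_d)
H_refined = graph_recurrent_step(H, A,
                                 rng.normal(size=(dim, dim)),
                                 rng.normal(size=(dim, dim)))
print(H_refined.shape)  # (8, 8): refined representations for all tokens
```

In this reading, the mask zeroes out intra-query and intra-document edges so message passing only flows across the query-document boundary; a relevance score would then be computed from `H_refined` through fully connected layers.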
Pages: 88-103
Page count: 16
Related papers
50 items
  • [31] Latent Words Recurrent Neural Network Language Models
    Masumura, Ryo
    Asami, Taichi
    Oba, Takanobu
    Masataki, Hirokazu
    Sakauchi, Sumitaka
    Ito, Akinori
    [J]. 16TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2015), VOLS 1-5, 2015, : 2380 - 2384
  • [32] Hierarchical state recurrent neural network for social emotion ranking
    Zhou, Deyu
    Zhang, Meng
    Yang, Yang
    He, Yulan
    [J]. COMPUTER SPEECH AND LANGUAGE, 2021, 68
  • [33] Edge-based graph neural network for ranking critical road segments in a network
    Jana, Debasish
    Malama, Sven
    Narasimhan, Sriram
    Taciroglu, Ertugrul
    [J]. PLOS ONE, 2023, 18 (12):
  • [34] Clustering by deep latent position model with graph convolutional network
    Liang, Dingge
    Corneli, Marco
    Bouveyron, Charles
    Latouche, Pierre
    [J]. ADVANCES IN DATA ANALYSIS AND CLASSIFICATION, 2024,
  • [35] Fuzzy graph based shortest path ranking method for optical network
    Adaikalam, A.
    Manikandan, S.
    Rajamani, V.
    [J]. OPTICAL AND QUANTUM ELECTRONICS, 2017, 49 (09)
  • [37] Cellular Network Traffic Prediction with Hybrid Graph Convolutional Recurrent Network
    Zhang, Miaoru
    Zhou, Hao
    Yu, Ke
    Wu, Xiaofei
[J]. WIRELESS PERSONAL COMMUNICATIONS, 138 (03): 1867 - 1892
  • [38] Learning Point Processes using Recurrent Graph Network
    Dash, Saurabh
    She, Xueyuan
    Mukhopadhyay, Saibal
    [J]. 2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [39] RGFN: Recurrent Graph Feature Network for ClickBait Detection
    Wang, Youwei
    Zhang, Haoran
    Zhu, Jianming
    Li, Yang
    Feng, Lizhou
    [J]. 2021 INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE BIG DATA AND INTELLIGENT SYSTEMS (HPBD&IS), 2021, : 151 - 156
  • [40] A Tutorial on Quantum Graph Recurrent Neural Network (QGRNN)
    Choi, Jaeho
    Oh, Seunghyeok
    Kim, Joongheon
    [J]. 35TH INTERNATIONAL CONFERENCE ON INFORMATION NETWORKING (ICOIN 2021), 2021, : 46 - 49