Diversifying Search Results using Self-Attention Network

Cited by: 22
Authors
Qin, Xubo [2 ]
Dou, Zhicheng [1 ]
Wen, Ji-Rong [3 ,4 ]
Affiliations
[1] Renmin Univ China, Gaoling Sch Artificial Intelligence, Beijing, Peoples R China
[2] Renmin Univ China, Sch Informat, Beijing, Peoples R China
[3] Beijing Key Lab Big Data Management & Anal Method, Beijing, Peoples R China
[4] MOE, Key Lab Data Engn & Knowledge Engn, Beijing, Peoples R China
Keywords
Search Result Diversification; Self Attention;
DOI
10.1145/3340531.3411914
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
Search results returned by search engines need to be diversified in order to satisfy the different information needs of different users. Several supervised learning models have been proposed for diversifying search results in recent years. Most existing supervised methods greedily compare each candidate document with the already selected document sequence and pick the next locally optimal document. However, the information utilities of the candidate documents are not independent of one another, and research has shown that selecting one candidate document affects the utilities of the remaining candidates. As a result, locally optimal document rankings do not lead to globally optimal rankings. In this paper, we propose a new supervised diversification framework to address this issue. Based on a self-attention encoder-decoder structure, the model takes the whole candidate document sequence as input and simultaneously leverages both the novelty and the subtopic coverage of the candidate documents. We call this framework Diversity Encoder with Self-Attention (DESA). Compared with existing supervised methods, this framework can model the interactions among all candidate documents and return their diversification scores based on the whole candidate document sequence. Experimental results show that our proposed framework outperforms existing methods. These results confirm the effectiveness of globally modeling all candidate documents for overall novelty and subtopic coverage, rather than comparing each single candidate document with the selected sequence during document selection.
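The global-versus-greedy distinction the abstract draws can be sketched in a few lines: instead of scoring one candidate at a time against an already selected sequence, a single self-attention pass lets every candidate's representation depend on all the others, and all diversification scores are produced jointly. The snippet below is a minimal illustration only, not the authors' DESA implementation; the projection matrices and the final linear scorer are random placeholders standing in for parameters that the real model would learn end-to-end.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a candidate set X of shape (n_docs, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    # Row-wise softmax: each document attends to every other candidate,
    # so its representation reflects the whole candidate set.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def diversification_scores(doc_embs, seed=0):
    """Score all candidates jointly from their context-aware representations.

    The weight matrices here are random placeholders (hypothetical);
    in a trained model they would be learned from labeled diverse rankings.
    """
    n, d = doc_embs.shape
    rng = np.random.default_rng(seed)
    Wq, Wk, Wv = (rng.normal(scale=d ** -0.5, size=(d, d)) for _ in range(3))
    H = self_attention(doc_embs, Wq, Wk, Wv)  # (n, d) context-aware reps
    w = rng.normal(size=d)
    return H @ w                              # one score per candidate
```

Because the attention weights couple every pair of candidates, changing one document's embedding shifts the scores of all the others, which is exactly the inter-document dependence that a greedy, one-document-at-a-time selector cannot capture.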
Pages: 1265-1274
Page count: 10
Related Papers
50 records in total
  • [1] Multilayer self-attention residual network for code search
    Hu, Haize
    Liu, Jianxun
    Zhang, Xiangping
    [J]. CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2023, 35 (09):
  • [2] GLOW : Global Weighted Self-Attention Network for Web Search
    Shan, Xuan
    Liu, Chuanjie
    Xia, Yiqian
    Chen, Qi
    Zhang, Yusi
    Ding, Kaize
    Liang, Yaobo
    Luo, Angen
    Luo, Yuxiang
    [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2021, : 519 - 528
  • [3] A mutual embedded self-attention network model for code search
    Hu, Haize
    Liu, Jianxun
    Zhang, Xiangping
    Cao, Ben
    Cheng, Siqiang
    Long, Teng
    [J]. JOURNAL OF SYSTEMS AND SOFTWARE, 2023, 198
  • [4] The function of the self-attention network
    Cunningham, Sheila J.
    [J]. COGNITIVE NEUROSCIENCE, 2016, 7 (1-4) : 21 - 22
  • [5] GDESA: Greedy Diversity Encoder with Self-attention for Search Results Diversification
    Qin, Xubo
    Dou, Zhicheng
    Zhu, Yutao
    Wen, Ji-Rong
    [J]. ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2023, 41 (02)
  • [6] Self-Attention Networks for Code Search
    Fang, Sen
    Tan, You-Shuai
    Zhang, Tao
    Liu, Yepang
    [J]. INFORMATION AND SOFTWARE TECHNOLOGY, 2021, 134
  • [7] Self-attention Hypergraph Pooling Network
    Zhao Y.-F.
    Jin F.-S.
    Li R.-H.
    Qin H.-C.
    Cui P.
    Wang G.-R.
    [J]. Ruan Jian Xue Bao/Journal of Software, 2023, 34 (10):
  • [8] Relevance, valence, and the self-attention network
    Mattan, Bradley D.
    Quinn, Kimberly A.
    Rotshtein, Pia
    [J]. COGNITIVE NEUROSCIENCE, 2016, 7 (1-4) : 27 - 28
  • [9] A self-attention network for smoke detection
    Jiang, Minghua
    Zhao, Yaxin
    Yu, Feng
    Zhou, Changlong
    Peng, Tao
    [J]. FIRE SAFETY JOURNAL, 2022, 129
  • [10] Dialogue Generation Using Self-Attention Generative Adversarial Network
    Hatua, Amartya
    Nguyen, Trung T.
    Sung, Andrew H.
    [J]. 2019 IEEE INTERNATIONAL CONFERENCE ON CONVERSATIONAL DATA & KNOWLEDGE ENGINEERING (CDKE), 2019, : 33 - 38