Combining Multi-Head Attention and Sparse Multi-Head Attention Networks for Session-Based Recommendation

Cited by: 1
Authors
Zhao, Zhiwei [1 ]
Wang, Xiaoye [1 ]
Xiao, Yingyuan [1 ]
Affiliations
[1] Tianjin Univ Technol, Sch Comp Sci & Engn, Tianjin, Peoples R China
Keywords
session-based recommendation; multi-head attention; sparse multi-head attention
DOI
10.1109/IJCNN54540.2023.10191924
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The core of session-based recommendation is to predict the next item a user will interact with from an anonymous sequence of temporal or specified behaviors (e.g., a sequence of clicked, browsed, or purchased items); it is a key task for many online services today. Recently, self-attention networks have achieved remarkable success on this task. However, some items in a session may be clicked by mistake, and most current attention mechanisms still assign weight to these items, diluting the model's focus. Sparse attention networks can address this issue, but relying on sparse attention alone may in turn reduce the weight of items that users clicked with genuine intent. This paper therefore proposes a model that combines a multi-head attention network with a sparse multi-head attention network, referred to as CMAN. CMAN avoids the traditional attention mechanism's drawback of assigning weight to mistakenly clicked items, while also mitigating, to some extent, the tendency of a purely sparse attention mechanism to down-weight genuinely intended clicks. Experiments on two real-world datasets show that the model outperforms several state-of-the-art models.
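The combination described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration only: it assumes sparsemax (Martins & Astudillo, 2016) as the sparse transformation and a simple convex combination (`alpha`) of the dense and sparse attention outputs; the paper's actual CMAN architecture may fuse the two networks differently.

```python
import numpy as np

def softmax(z):
    # Dense weights: every item, including mis-clicks, gets non-zero weight.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def sparsemax(z):
    # Euclidean projection onto the probability simplex (Martins & Astudillo,
    # 2016): low-scoring items receive exactly zero weight.
    z_sorted = np.sort(z, axis=-1)[..., ::-1]
    k = np.arange(1, z.shape[-1] + 1)
    cssv = np.cumsum(z_sorted, axis=-1)
    support = 1 + k * z_sorted > cssv          # items kept in the support
    k_z = support.sum(axis=-1, keepdims=True)  # support size
    tau = (np.take_along_axis(cssv, k_z - 1, axis=-1) - 1) / k_z
    return np.maximum(z - tau, 0.0)

def attention(q, K, V, weight_fn):
    # Scaled dot-product attention with a pluggable weighting function.
    scores = (K @ q) / np.sqrt(q.shape[-1])
    return weight_fn(scores) @ V

def cman_style_attention(q, K, V, alpha=0.5):
    # Hypothetical fusion: a convex mix of the dense (softmax) branch, which
    # preserves weight on genuinely intended items, and the sparse branch,
    # which zeroes out likely mis-clicks.
    dense = attention(q, K, V, softmax)
    sparse = attention(q, K, V, sparsemax)
    return alpha * dense + (1 - alpha) * sparse
```

Note how the two branches compensate for each other's weakness: sparsemax alone might zero out a real-intent item whose score is merely moderate, while the softmax branch keeps it in play.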
Pages: 8
Related Papers
50 records in total
  • [1] Enhanced Multi-Head Self-Attention Graph Neural Networks for Session-based Recommendation
    Pan, Wenhao
    Yang, Kai
    [J]. ENGINEERING LETTERS, 2022, 30 (01) : 37 - 44
  • [2] Session-based recommendation: Learning multi-dimension interests via a multi-head attention graph neural network
    Chen, Yao
    Xiong, Qi
    Guo, Yina
    [J]. APPLIED SOFT COMPUTING, 2022, 131
  • [3] On the diversity of multi-head attention
    Li, Jian
    Wang, Xing
    Tu, Zhaopeng
    Lyu, Michael R.
    [J]. NEUROCOMPUTING, 2021, 454 : 14 - 24
  • [4] Improving Multi-head Attention with Capsule Networks
    Gu, Shuhao
    Feng, Yang
    [J]. NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING (NLPCC 2019), PT I, 2019, 11838 : 314 - 326
  • [5] Hybrid graph convolutional networks with multi-head attention for location recommendation
    Zhong, Ting
    Zhang, Shengming
    Zhou, Fan
    Zhang, Kunpeng
    Trajcevski, Goce
    Wu, Jin
    [J]. WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS, 2020, 23 (06): 3125 - 3151
  • [7] Multi-head multi-order graph attention networks
    Ben, Jie
    Sun, Qiguo
    Liu, Keyu
    Yang, Xibei
    Zhang, Fengjun
    [J]. APPLIED INTELLIGENCE, 2024, 54 (17-18) : 8092 - 8107
  • [8] Acoustic Scene Analysis with Multi-head Attention Networks
    Wang, Weimin
    Wang, Weiran
    Sun, Ming
    Wang, Chao
    [J]. INTERSPEECH 2020, 2020, : 1191 - 1195
  • [9] Neural News Recommendation with Multi-Head Self-Attention
    Wu, Chuhan
    Wu, Fangzhao
    Ge, Suyu
    Qi, Tao
    Huang, Yongfeng
    Xie, Xing
    [J]. 2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 6389 - 6394
  • [10] Leveraging mixed distribution of multi-head attention for sequential recommendation
    Zhang, Yihao
    Liu, Xiaoyang
    [J]. Applied Intelligence, 2023, 53 : 454 - 469