Enhanced Self-Attention Mechanism for Long and Short Term Sequential Recommendation Models

Cited by: 2
Authors
Zheng, Xiaoyao [1 ,2 ]
Li, Xingwang [1 ,2 ]
Chen, Zhenghua [1 ,2 ,3 ]
Sun, Liping [1 ,2 ]
Yu, Qingying [1 ,2 ]
Guo, Liangmin [1 ,2 ]
Luo, Yonglong [1 ,2 ]
Affiliations
[1] Anhui Normal Univ, Anhui Prov Key Lab Network & Informat Secur, Wuhu 241002, Peoples R China
[2] Anhui Normal Univ, Sch Comp & Informat, Wuhu 241002, Peoples R China
[3] Agcy Sci Technol Infocomm Res I2R & Res ASTAR, Inst Infocomm Res, Singapore 138632, Singapore
Keywords
Sequential recommendation; enhanced self-attention mechanism; gated recurrent unit; position weight; NETWORK;
DOI
10.1109/TETCI.2024.3366771
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Compared with traditional recommendation algorithms based on collaborative filtering and content, sequential recommendation can better capture changes in user interests and recommend the items a user is likely to interact with next, based on the user's historical interaction behaviors. Two traditional approaches to sequential recommendation are the Markov Chain (MC) and the Deep Neural Network (DNN), both of which ignore the relationships among different behaviors and the dynamic changes of user interest in items over time. Furthermore, earlier methods usually process the user's historical interactions strictly in chronological order, which may lose part of the preference information. Starting from the observation that user preferences change over time, this paper proposes RP-SANRec, a long- and short-term sequential recommendation model with an enhanced self-attention network. The short-term intent module of RP-SANRec uses a Gated Recurrent Unit (GRU) to learn from the user's full historical interaction sequence and compute position weights in temporal order, which are then used to enhance the input of the self-attention mechanism. The long-term module captures the user's general preferences through a bidirectional long short-term memory network (Bi-LSTM). Finally, the user's dynamic interests and general preferences are fused to predict the next recommendation. We evaluate RP-SANRec on three public datasets under the HR@10 and NDCG@10 metrics; extensive experiments show that it outperforms existing models.
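The architecture described in the abstract can be summarized in the following minimal PyTorch sketch. It is an illustration only: the embedding size, the sigmoid-based position weighting, the mean-pooled Bi-LSTM output, and the concatenation-based fusion are assumptions made for this sketch, not the paper's exact formulation.

# Minimal sketch of an RP-SANRec-style model: a GRU derives position weights
# that enhance the self-attention input (short-term intent), a Bi-LSTM models
# general long-term preference, and the two are fused for next-item scoring.
# Hyperparameters and the fusion rule are illustrative assumptions.
import torch
import torch.nn as nn


class RPSANRecSketch(nn.Module):
    def __init__(self, num_items: int, d_model: int = 64, n_heads: int = 2):
        super().__init__()
        self.item_emb = nn.Embedding(num_items + 1, d_model, padding_idx=0)

        # Short-term intent: GRU over the interaction sequence yields
        # per-position weights that enhance the self-attention input.
        self.gru = nn.GRU(d_model, d_model, batch_first=True)
        self.pos_weight = nn.Linear(d_model, 1)
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

        # Long-term preference: bidirectional LSTM over the same sequence.
        self.bilstm = nn.LSTM(d_model, d_model // 2, batch_first=True,
                              bidirectional=True)

        # Fusion of dynamic (short-term) and general (long-term) representations.
        self.fuse = nn.Linear(2 * d_model, d_model)

    def forward(self, seq: torch.Tensor) -> torch.Tensor:
        # seq: (batch, seq_len) of item ids, 0 = padding
        x = self.item_emb(seq)                              # (B, L, D)

        # --- short-term module ---
        gru_out, _ = self.gru(x)                            # (B, L, D)
        weights = torch.sigmoid(self.pos_weight(gru_out))   # (B, L, 1)
        enhanced = x * weights                              # position-weighted input
        attn_out, _ = self.self_attn(enhanced, enhanced, enhanced)
        short_term = attn_out[:, -1, :]                     # last-step representation

        # --- long-term module ---
        lstm_out, _ = self.bilstm(x)                        # (B, L, D)
        long_term = lstm_out.mean(dim=1)                    # pooled general preference

        # --- fusion and next-item scoring ---
        user = self.fuse(torch.cat([short_term, long_term], dim=-1))  # (B, D)
        scores = user @ self.item_emb.weight.t()            # (B, num_items + 1)
        return scores


# Usage example with hypothetical sizes:
if __name__ == "__main__":
    model = RPSANRecSketch(num_items=1000)
    batch = torch.randint(1, 1001, (4, 20))  # 4 users, 20 interactions each
    print(model(batch).shape)                # torch.Size([4, 1001])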
Pages: 2457-2466
Page count: 10
Related Papers (50 in total)
  • [21] Feature Interaction Dual Self-attention network for sequential recommendation
    Zhu, Yunfeng
    Yao, Shuchun
    Sun, Xun
    FRONTIERS IN NEUROROBOTICS, 2024, 18
  • [22] CSAN: Contextual Self-Attention Network for User Sequential Recommendation
    Huang, Xiaowen
    Qian, Shengsheng
    Fang, Quan
    Sang, Jitao
    Xu, Changsheng
    PROCEEDINGS OF THE 2018 ACM MULTIMEDIA CONFERENCE (MM'18), 2018, : 447 - 455
  • [23] Combining Non-sampling and Self-attention for Sequential Recommendation
    Chen, Guangjin
    Zhao, Guoshuai
    Zhu, Li
    Zhuo, Zhimin
    Qian, Xueming
    INFORMATION PROCESSING & MANAGEMENT, 2022, 59 (02)
  • [24] Sequential Recommendation with Relation-Aware Kernelized Self-Attention
    Ji, Mingi
    Joo, Weonyoung
    Song, Kyungwoo
    Kim, Yoon-Yeong
    Moon, Il-Chul
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 4304 - 4311
  • [25] Modeling Periodic Pattern with Self-Attention Network for Sequential Recommendation
    Ma, Jun
    Zhao, Pengpeng
    Liu, Yanchi
    Sheng, Victor S.
    Xu, Jiajie
    Zhao, Lei
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS (DASFAA 2020), PT III, 2020, 12114 : 557 - 572
  • [26] A self-attention sequential model for long-term prediction of video streams
    Ge, Yunfeng
    Li, Hongyan
    Shi, Keyi
    Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2024, 51 (03): 88 - 102
  • [27] Long- and short-term collaborative attention networks for sequential recommendation
    Yumin Dong
    Yongfu Zha
    Yongjian Zhang
    Xinji Zha
    The Journal of Supercomputing, 2023, 79 : 18375 - 18393
  • [28] Long- and short-term collaborative attention networks for sequential recommendation
    Dong, Yumin
    Zha, Yongfu
    Zhang, Yongjian
    Zha, Xinji
    JOURNAL OF SUPERCOMPUTING, 2023, 79 (16): 18375 - 18393
  • [29] FSASA: Sequential Recommendation Based on Fusing Session-Aware Models and Self-Attention Networks
    Guo, Shangzhi
    Liao, Xiaofeng
    Meng, Fei
    Zhao, Qing
    Tang, Yuling
    Li, Hui
    Zong, Qinqin
    COMPUTER SCIENCE AND INFORMATION SYSTEMS, 2024, 21 (01) : 1 - 20
  • [30] Time interval-Aware graph with self-Attention for sequential recommendation
    Institutes of Physical Science and Information Technology, Anhui University, Hefei 230601, China
    ACM Int. Conf. Proc. Ser.,