Enhanced Self-Attention Mechanism for Long and Short Term Sequential Recommendation Models

Cited by: 2
Authors
Zheng, Xiaoyao [1 ,2 ]
Li, Xingwang [1 ,2 ]
Chen, Zhenghua [1 ,2 ,3 ]
Sun, Liping [1 ,2 ]
Yu, Qingying [1 ,2 ]
Guo, Liangmin [1 ,2 ]
Luo, Yonglong [1 ,2 ]
Affiliations
[1] Anhui Normal Univ, Anhui Prov Key Lab Network & Informat Secur, Wuhu 241002, Peoples R China
[2] Anhui Normal Univ, Sch Comp & Informat, Wuhu 241002, Peoples R China
[3] Agcy Sci Technol Infocomm Res I2R & Res ASTAR, Inst Infocomm Res, Singapore 138632, Singapore
Keywords
Sequential recommendation; enhanced self-attention mechanism; gated recurrent unit; position weight; NETWORK;
DOI
10.1109/TETCI.2024.3366771
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Compared with traditional recommendation algorithms based on collaborative filtering and content, sequential recommendation better captures changes in user interests and recommends the items a user is likely to interact with next, based on the user's historical interaction behaviors. Two traditional approaches to sequential recommendation, the Markov Chain (MC) and the Deep Neural Network (DNN), both ignore the relationships among different behaviors and the dynamic changes of user interest in items over time. Furthermore, earlier methods usually process the user's historical interactions strictly in chronological order, which may lose part of the preference information. Starting from the observation that user preferences change over time, this paper proposes RP-SANRec, a long- and short-term sequential recommendation model with an enhanced self-attention network. The short-term intent module of RP-SANRec uses a Gated Recurrent Unit (GRU) to learn the user's full historical interaction sequence and compute position weights in time order, which are used to enhance the input of the self-attention mechanism. The long-term module captures the user's general preferences through a bidirectional long short-term memory network (Bi-LSTM). Finally, the user's dynamic interests and general preferences are fused to predict the next recommendation. We apply RP-SANRec to three public datasets under two evaluation metrics, HR@10 and NDCG@10; extensive experiments show that the proposed model outperforms existing models.
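
The record gives only the abstract, but the described architecture can be summarized in a short PyTorch sketch: a GRU computes position weights that enhance the self-attention input (short-term module), a Bi-LSTM captures general preferences (long-term module), and the two are fused to score candidate items. All layer sizes, the additive position-weight enhancement, and the linear fusion step below are illustrative assumptions, not the authors' exact RP-SANRec implementation.

    # Minimal sketch of a long/short-term sequential recommender, assuming
    # additive position-weight enhancement and linear fusion (hypothetical details).
    import torch
    import torch.nn as nn

    class LongShortTermRecommender(nn.Module):
        def __init__(self, num_items, dim=64, num_heads=2):
            super().__init__()
            self.item_emb = nn.Embedding(num_items, dim, padding_idx=0)
            # Short-term module: GRU over the interaction sequence yields
            # position-dependent weights that enhance the self-attention input.
            self.gru = nn.GRU(dim, dim, batch_first=True)
            self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
            # Long-term module: Bi-LSTM captures the user's general preferences.
            self.bilstm = nn.LSTM(dim, dim, batch_first=True, bidirectional=True)
            self.fuse = nn.Linear(dim + 2 * dim, dim)  # fuse short- and long-term states
            self.out = nn.Linear(dim, num_items)

        def forward(self, seq):                     # seq: (batch, seq_len) item ids
            x = self.item_emb(seq)                  # (batch, seq_len, dim)
            pos_weight, _ = self.gru(x)             # position weights in time order
            enhanced = x + pos_weight               # enhance the self-attention input
            short_out, _ = self.attn(enhanced, enhanced, enhanced)
            long_out, _ = self.bilstm(x)            # (batch, seq_len, 2*dim)
            fused = torch.tanh(
                self.fuse(torch.cat([short_out[:, -1], long_out[:, -1]], dim=-1))
            )
            return self.out(fused)                  # scores over the item vocabulary

    # Usage: score candidate next items for two sequences of five interactions each.
    model = LongShortTermRecommender(num_items=1000)
    scores = model(torch.randint(1, 1000, (2, 5)))  # shape (2, 1000)
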
Pages: 2457-2466
Number of pages: 10
Related Papers
50 records
  • [1] Long- and short-term self-attention network for sequential recommendation
    Xu, Chengfeng
    Feng, Jian
    Zhao, Pengpeng
    Zhuang, Fuzhen
    Wang, Deqing
    Liu, Yanchi
    Sheng, Victor S.
    NEUROCOMPUTING, 2021, 423 : 580 - 589
  • [2] Sequential Recommendation Based on Long-Term and Short-Term User Behavior with Self-attention
    Wei, Xing
    Zuo, Xianglin
    Yang, Bo
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, KSEM 2019, PT I, 2019, 11775 : 72 - 83
  • [3] Sequential Prediction of Glycosylated Hemoglobin Based on Long Short-Term Memory with Self-Attention Mechanism
    Wang, Xiaojia
    Gong, Wenqing
    Zhu, Keyu
    Yao, Lushi
    Zhang, Shanshan
    Xu, Weiqun
    Guan, Yuxiang
    INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE SYSTEMS, 2020, 13 (01) : 1578 - 1589
  • [4] Sequential Prediction of Glycosylated Hemoglobin Based on Long Short-Term Memory with Self-Attention Mechanism
    Xiaojia Wang
    Wenqing Gong
    Keyu Zhu
    Lushi Yao
    Shanshan Zhang
    Weiqun Xu
    Yuxiang Guan
    International Journal of Computational Intelligence Systems, 2020, 13 : 1578 - 1589
  • [5] Sequential Recommendation via Stochastic Self-Attention
    Fan, Ziwei
    Liu, Zhiwei
    Wang, Yu
    Wang, Alice
    Nazari, Zahra
    Zheng, Lei
    Peng, Hao
    Yu, Philip S.
    PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022, : 2036 - 2047
  • [6] Variational Self-attention Network for Sequential Recommendation
    Zhao, Jing
    Zhao, Pengpeng
    Zhao, Lei
    Liu, Yanchi
    Sheng, Victor S.
    Zhou, Xiaofang
    2021 IEEE 37TH INTERNATIONAL CONFERENCE ON DATA ENGINEERING (ICDE 2021), 2021, : 1559 - 1570
  • [7] HSA: Hyperbolic Self-attention for Sequential Recommendation
    Hou, Peizhong
    Wang, Haiyang
    Li, Tianming
    Yan, Junchi
    WEB AND BIG DATA, PT III, APWEB-WAIM 2023, 2024, 14333 : 250 - 264
  • [8] An Improved Sequential Recommendation Algorithm based on Short-Sequence Enhancement and Temporal Self-Attention Mechanism
    Ni, Jianjun
    Tang, Guangyi
    Shen, Tong
    Cai, Yu
    Cao, Weidong
    COMPLEXITY, 2022, 2022
  • [9] Review-Enhanced Sequential Recommendation with Self-Attention and Graph Collaborative Features
    Hong, Yunqi
    Ye, Wei
    2023 23RD IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS, ICDMW 2023, 2023, : 1493 - 1499
  • [10] FISSA: Fusing Item Similarity Models with Self-Attention Networks for Sequential Recommendation
    Lin, Jing
    Pan, Weike
    Ming, Zhong
    RECSYS 2020: 14TH ACM CONFERENCE ON RECOMMENDER SYSTEMS, 2020, : 130 - 139