Sequential Recommendation via Stochastic Self-Attention

Cited by: 59
Authors
Fan, Ziwei [1 ,5 ]
Liu, Zhiwei [1 ]
Wang, Yu [1 ]
Wang, Alice [2 ]
Nazari, Zahra [2 ]
Zheng, Lei [3 ]
Peng, Hao [4 ]
Yu, Philip S. [1 ]
Affiliations
[1] Univ Illinois, Dept Comp Sci, Chicago, IL 60680 USA
[2] Spotify, New York, NY USA
[3] Pinterest Inc, Chicago, IL USA
[4] Beihang Univ, Sch Cyber Sci & Technol, Beijing, Peoples R China
[5] Spotify Res, New York, NY USA
Keywords
Sequential Recommendation; Transformer; Self-Attention; Uncertainty;
DOI
10.1145/3485447.3512077
CLC number
TP3 [Computing technology, computer technology];
Subject classification code
0812;
Abstract
Sequential recommendation models the dynamics of a user's previous behaviors in order to forecast the next item, and has drawn considerable attention. Transformer-based approaches, which embed items as vectors and use dot-product self-attention to measure relationships between items, demonstrate superior capability among existing sequential methods. However, users' real-world sequential behaviors are uncertain rather than deterministic, posing a significant challenge to existing techniques. We further argue that dot-product-based approaches cannot fully capture collaborative transitivity, which can be derived from item-item transitions within sequences and is beneficial for cold-start items. We also argue that the BPR loss places no constraint on positive and sampled negative items, which misleads optimization. To overcome these issues, we propose a novel STOchastic Self-Attention (STOSA) model. In particular, STOSA embeds each item as a stochastic Gaussian distribution whose covariance encodes uncertainty. We devise a novel Wasserstein Self-Attention module to characterize position-wise item-item relationships in sequences, which effectively incorporates uncertainty into model training. Wasserstein attention also facilitates collaborative transitivity learning, as it satisfies the triangle inequality. Moreover, we introduce a novel regularization term into the ranking loss that ensures dissimilarity between positive and negative items. Extensive experiments on five real-world benchmark datasets demonstrate the superiority of the proposed model over state-of-the-art baselines, especially on cold-start items. The code is available at https://github.com/zfan20/STOSA.
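The Wasserstein self-attention described in the abstract can be illustrated with a minimal NumPy sketch: each item is embedded as a diagonal-covariance Gaussian (a mean vector and a standard-deviation vector), and attention weights are derived from negative pairwise 2-Wasserstein distances, for which the closed form between diagonal Gaussians is W2² = ||μ₁−μ₂||² + ||σ₁−σ₂||². The function names and the softmax normalization below are illustrative assumptions, not the paper's released implementation.

```python
import numpy as np

def wasserstein2_sq(mu1, sigma1, mu2, sigma2):
    # Squared 2-Wasserstein distance between two diagonal Gaussians:
    # W2^2 = ||mu1 - mu2||^2 + ||sigma1 - sigma2||^2
    return np.sum((mu1 - mu2) ** 2) + np.sum((sigma1 - sigma2) ** 2)

def wasserstein_attention(mu, sigma):
    """Toy attention: softmax over negative pairwise W2^2 distances.

    mu, sigma: arrays of shape (n_items, dim) holding the mean and
    standard-deviation vectors of each item's Gaussian embedding.
    Returns an (n_items, n_items) row-stochastic weight matrix.
    """
    n = mu.shape[0]
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            dist[i, j] = wasserstein2_sq(mu[i], sigma[i], mu[j], sigma[j])
    scores = -dist  # smaller distance -> larger attention weight
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    return weights / weights.sum(axis=1, keepdims=True)
```

Because the square root of `wasserstein2_sq` is a metric, it satisfies the triangle inequality, which is the property the abstract credits for collaborative transitivity: if item A is close to B and B is close to C, then A cannot be arbitrarily far from C.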
Pages: 2036-2047
Page count: 12
Related papers
50 records in total
  • [1] Variational Self-attention Network for Sequential Recommendation
    Zhao, Jing
    Zhao, Pengpeng
    Zhao, Lei
    Liu, Yanchi
    Sheng, Victor S.
    Zhou, Xiaofang
    2021 IEEE 37TH INTERNATIONAL CONFERENCE ON DATA ENGINEERING (ICDE 2021), 2021, : 1559 - 1570
  • [2] Stochastic shared embeddings and latent intent aware self-attention for sequential recommendation
    Wu, Di
    Ma, Wenli
    Yang, Lijun
    MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (33) : 78897 - 78922
  • [3] HSA: Hyperbolic Self-attention for Sequential Recommendation
    Hou, Peizhong
    Wang, Haiyang
    Li, Tianming
    Yan, Junchi
    WEB AND BIG DATA, PT III, APWEB-WAIM 2023, 2024, 14333 : 250 - 264
  • [4] AUBRec: adaptive augmented self-attention via user behaviors for sequential recommendation
    Fan, Jin
    Yu, Xiaofeng
    Wang, Zehao
    Wang, Weijie
    Sun, Danfeng
    Wu, Huifeng
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (24): : 21715 - 21728
  • [5] Sequential Recommendation via Temporal Self-Attention and Multi-Preference Learning
    Wang, Wenchao
    Zhu, Jinghua
    Xi, Heran
    WIRELESS ALGORITHMS, SYSTEMS, AND APPLICATIONS, WASA 2021, PT II, 2021, 12938 : 18 - 30
  • [7] Core Interests Focused Self-attention for Sequential Recommendation
    Ai, Zhengyang
    Wang, Shupeng
    Jia, Siyu
    Guo, Shu
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, DASFAA 2022, PT II, 2022, : 306 - 314
  • [8] Weight Adjustment Framework for Self-Attention Sequential Recommendation
    Su, Zheng-Ang
    Zhang, Juan
    APPLIED SCIENCES-BASEL, 2024, 14 (09):
  • [9] Time Interval Aware Self-Attention for Sequential Recommendation
    Li, Jiacheng
    Wang, Yujie
    McAuley, Julian
    PROCEEDINGS OF THE 13TH INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING (WSDM '20), 2020, : 322 - 330
  • [10] Self-Attention Based Sequential Recommendation With Graph Convolutional Networks
    Seng, Dewen
    Wang, Jingchang
    Zhang, Xuefeng
    IEEE ACCESS, 2024, 12 : 32780 - 32787