SanMove: next location recommendation via self-attention network

Cited by: 0
Authors
Wang, Bin [1 ]
Li, Huifeng [1 ]
Tong, Le [1 ]
Zhang, Qian [1 ]
Zhu, Sulei [1 ]
Yang, Tao [2 ]
Affiliations
[1] Shanghai Normal Univ, Shanghai, Peoples R China
[2] Shanghai Urban & Rural Construct & Traff Dev Res I, Shanghai, Peoples R China
Keywords
Next location prediction; Self-attention network; Auxiliary information; PREDICTION;
DOI
10.1108/DTA-03-2022-0093
CLC number
TP [Automation technology, computer technology];
Discipline code
0812;
Abstract
Purpose
This paper aims to address the following issues: (1) most existing methods are based on recurrent networks, which are time-consuming to train on long sequences because they do not allow full parallelism; (2) personalized preferences are generally not considered reasonably; (3) existing methods have rarely studied systematically how to efficiently utilize the various auxiliary information in trajectory data (e.g. user ID and time stamp) and the spatiotemporal relations among nonconsecutive locations.
Design/methodology/approach
The authors propose a novel self-attention network-based model named SanMove to predict the next location by capturing the long- and short-term mobility patterns of users. Specifically, SanMove uses a self-attention module to capture each user's long-term preference, which represents her personalized location preference. Meanwhile, the authors use a spatial-temporal guided noninvasive self-attention (STNOVA) module to exploit auxiliary information in the trajectory data to learn the user's short-term preference.
Findings
The authors evaluate SanMove on two real-world datasets. The experimental results demonstrate that SanMove is not only faster than state-of-the-art recurrent neural network (RNN)-based prediction models but also outperforms the baselines for next location prediction.
Originality/value
The authors propose a self-attention-based sequential model named SanMove to predict the user's trajectory, which comprises long-term and short-term preference learning modules. SanMove allows fully parallel processing of trajectories to improve efficiency. The authors propose an STNOVA module to capture the sequential transitions of current trajectories. Moreover, the self-attention module is used to process historical trajectory sequences in order to capture the personalized location preference of each user. The authors conduct extensive experiments on two check-in datasets. The experimental results demonstrate that the model trains fast and performs excellently compared with existing RNN-based methods for next location prediction.
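The abstract's core idea (applying self-attention over a user's check-in sequence to score candidate next locations) can be illustrated with a minimal sketch. This is an assumption-laden toy in NumPy, not the authors' SanMove implementation: the embedding table is random rather than learned, there are no query/key/value projections, no STNOVA auxiliary features, and the vocabulary size, dimensions and check-in IDs are invented for illustration.

```python
import numpy as np

# Illustrative sketch only: single-head scaled dot-product self-attention
# over a toy check-in sequence. All sizes and IDs are made-up assumptions.
rng = np.random.default_rng(0)

num_locations = 100   # size of the location vocabulary (hypothetical)
d_model = 16          # embedding dimension (hypothetical)
seq = [3, 17, 42, 8]  # recent check-ins (location IDs), oldest first

# Location embedding table (random here; learned in a real model)
emb = rng.normal(size=(num_locations, d_model))

def self_attention(x):
    """Scaled dot-product self-attention without learned projections."""
    scores = x @ x.T / np.sqrt(x.shape[-1])         # (T, T) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ x                              # attended representations

x = emb[seq]           # (T, d_model) embeddings of the check-in sequence
h = self_attention(x)  # contextualized sequence representation
user_state = h[-1]     # last position summarizes the short-term preference

# Rank every candidate location by similarity to the user state
logits = emb @ user_state
top5 = np.argsort(-logits)[:5]
print(top5)
```

Because every attention score is a matrix product over the whole sequence, all positions are processed in parallel, which is the efficiency advantage over step-by-step RNN recurrence that the paper highlights.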
Pages: 330-343
Page count: 14
Related papers (50 in total)
  • [41] Image Editing via Segmentation Guided Self-Attention Network
    Zhang, Jianfu
    Yang, Peiming
    Wang, Wentao
    Hong, Yan
    Zhang, Liqing
    IEEE SIGNAL PROCESSING LETTERS, 2020, 27 : 1605 - 1609
  • [42] Context-and category-aware double self-attention model for next POI recommendation
    Wang, Dongjing
    Wan, Feng
    Yu, Dongjin
    Shen, Yi
    Xiang, Zhengzhe
    Xu, Yueshen
    APPLIED INTELLIGENCE, 2023, 53 (15) : 18355 - 18380
  • [44] Mechanics of Next Token Prediction with Self-Attention
    Li, Yingcong
    Huang, Yixiao
    Ildiz, M. Emrullah
    Rawat, Ankit Singh
    Oymak, Samet
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024, 238
  • [45] Feature-Level Deeper Self-Attention Network With Contrastive Learning for Sequential Recommendation
    Hao, Yongjing
    Zhang, Tingting
    Zhao, Pengpeng
    Liu, Yanchi
    Sheng, Victor S.
    Xu, Jiajie
    Liu, Guanfeng
    Zhou, Xiaofang
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (10) : 10112 - 10124
  • [46] Shared-Account Cross-Domain Sequential Recommendation with Self-Attention Network
    Guo L.
    Li Q.
    Liu F.
    Wang X.
    Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2021, 58 (11): : 2524 - 2537
  • [47] A time-aware self-attention based neural network model for sequential recommendation
    Zhang, Yihu
    Yang, Bo
    Liu, Haodong
    Li, Dongsheng
    APPLIED SOFT COMPUTING, 2023, 133
  • [48] Hashtag Recommendation Using LSTM Networks with Self-Attention
    Shen, Yatian
    Li, Yan
    Sun, Jun
    Ding, Wenke
    Shi, Xianjin
    Zhang, Lei
    Shen, Xiajiong
    He, Jing
    CMC-COMPUTERS MATERIALS & CONTINUA, 2019, 61 (03): : 1261 - 1269
  • [49] Core Interests Focused Self-attention for Sequential Recommendation
    Ai, Zhengyang
    Wang, Shupeng
    Jia, Siyu
    Guo, Shu
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, DASFAA 2022, PT II, 2022, : 306 - 314
  • [50] Weight Adjustment Framework for Self-Attention Sequential Recommendation
    Su, Zheng-Ang
    Zhang, Juan
    APPLIED SCIENCES-BASEL, 2024, 14 (09):