Improving Transformer-based Sequential Recommenders through Preference Editing

Cited by: 14
|
Authors
Ma, Muyang [1 ]
Ren, Pengjie [1 ]
Chen, Zhumin [1 ]
Ren, Zhaochun [1 ]
Liang, Huasheng [2 ]
Ma, Jun [1 ]
De Rijke, Maarten [3 ]
Affiliations
[1] Shandong Univ, Shandong, Peoples R China
[2] Tencent, WeChat, Beijing, Peoples R China
[3] Univ Amsterdam, Amsterdam, Netherlands
Keywords
Transformer-based sequential recommendation; self-supervised learning; user preference extraction and representation;
DOI
10.1145/3564282
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline code
0812 ;
Abstract
One of the key challenges in sequential recommendation is how to extract and represent user preferences. Traditional methods rely solely on predicting the next item. But user behavior may be driven by complex preferences. Therefore, these methods cannot make accurate recommendations when the available information on user behavior is limited. To explore multiple user preferences, we propose a transformer-based sequential recommendation model, named MrTransformer (Multi-preference Transformer). For training MrTransformer, we devise a preference-editing-based self-supervised learning (SSL) mechanism that explores extra supervision signals based on relations with other sequences. The idea is to force the sequential recommendation model to discriminate between common and unique preferences in different sequences of interactions. By doing so, the sequential recommendation model is able to disentangle user preferences into multiple independent preference representations so as to improve user preference extraction and representation. We carry out extensive experiments on five benchmark datasets. MrTransformer with preference editing significantly outperforms state-of-the-art sequential recommendation methods in terms of Recall, MRR, and NDCG. We find that long sequences of interactions, from which user preferences are harder to extract and represent, benefit most from preference editing.
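The preference-editing idea described in the abstract — identify preferences that two interaction sequences share, keep those, and swap the user-specific ones to create self-supervision targets — can be illustrated with a small sketch. This is a hypothetical simplification, not the paper's actual procedure: the preference vectors, the cosine-similarity matching, and the `threshold` parameter are all assumptions made for illustration; in MrTransformer the representations come from a transformer encoder and the swap feeds a reconstruction objective.

```python
import math

def cosine(u, v):
    """Cosine similarity between two preference vectors (plain lists)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / (norm + 1e-8)

def edit_preferences(prefs_a, prefs_b, threshold=0.8):
    """Build self-supervision targets by swapping preference vectors that
    are unique to each user; preferences shared by both users (best-match
    cosine similarity above `threshold`) stay in place."""
    edited_a = [list(p) for p in prefs_a]
    edited_b = [list(p) for p in prefs_b]
    for i, pa in enumerate(prefs_a):
        sims = [cosine(pa, pb) for pb in prefs_b]
        j = max(range(len(sims)), key=sims.__getitem__)
        if sims[j] < threshold:
            # no close counterpart: treat as a unique preference and swap
            # it, so a decoder must reconstruct the *other* user's sequence
            edited_a[i] = list(prefs_b[j])
            edited_b[j] = list(pa)
    return edited_a, edited_b

# toy example: one shared preference ([1, 0]) and one unique per user
a = [[1.0, 0.0], [0.0, 1.0]]
b = [[1.0, 0.0], [-1.0, 0.0]]
edited_a, edited_b = edit_preferences(a, b)
```

After editing, the shared preference survives in both users' representations while the unique ones have changed sides, which is the discrimination signal the SSL objective exploits.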
Pages: 24
Related papers
50 records
  • [1] Improving Transformer-based Sequential Conversational Recommendations through Knowledge Graph Embeddings
    Petruzzelli, Alessandro
    Martina, Alessandro Francesco Maria
    Spillo, Giuseppe
    Musto, Cataldo
    de Gemmis, Marco
    Lops, Pasquale
    Semeraro, Giovanni
    PROCEEDINGS OF THE 32ND ACM CONFERENCE ON USER MODELING, ADAPTATION AND PERSONALIZATION, UMAP 2024, 2024, : 172 - 182
  • [2] Personalization Through User Attributes for Transformer-Based Sequential Recommendation
    Fischer, Elisabeth
    Dallmann, Alexander
    Hotho, Andreas
    RECOMMENDER SYSTEMS IN FASHION AND RETAIL, 2023, 981 : 25 - 43
  • [3] Improving Conversational Recommender Systems via Transformer-based Sequential Modelling
    Zou, Jie
    Kanoulas, Evangelos
    Ren, Pengjie
    Ren, Zhaochun
    Sun, Aixin
    Long, Cheng
    PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22), 2022, : 2319 - 2324
  • [4] Attention Calibration for Transformer-based Sequential Recommendation
    Zhou, Peilin
    Ye, Qichen
    Xie, Yueqi
    Gao, Jingqi
    Wang, Shoujin
    Kim, Jae Boum
    You, Chenyu
    Kim, Sunghun
    PROCEEDINGS OF THE 32ND ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2023, 2023, : 3595 - 3605
  • [5] Improving Transformer-based Program Repair Models through False Behavior Diagnosis
    Kim, Youngkyoung
    Kim, Misoo
    Lee, Eunseok
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2023), 2023, : 14010 - 14023
  • [6] Transformer-Based Rating-Aware Sequential Recommendation
    Li, Yang
    Li, Qianmu
    Meng, Shunmei
    Hou, Jun
    ALGORITHMS AND ARCHITECTURES FOR PARALLEL PROCESSING, ICA3PP 2021, PT I, 2022, 13155 : 759 - 774
  • [7] A Transformer-based Multi-Platform Sequential Estimation Fusion
    Zhai, Xupeng
    Yang, Yanbo
    Liu, Zhunga
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2025, 144
  • [8] A transformer-based approach for improving app review response generation
    Zhang, Weizhe
    Gu, Wenchao
    Gao, Cuiyun
    Lyu, Michael R.
    SOFTWARE-PRACTICE & EXPERIENCE, 2023, 53 (02): : 438 - 454
  • [9] Transformer-based reranking for improving Korean morphological analysis systems
    Ryu, Jihee
    Lim, Soojong
    Kwon, Oh-Woog
    Na, Seung-Hoon
    ETRI JOURNAL, 2024, 46 (01) : 137 - 153
  • [10] Improving Efficiency and Robustness of Transformer-based Information Retrieval Systems
    Begoli, Edmon
    Srinivasan, Sudarshan
    Mahbub, Maria
    PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22), 2022, : 3433 - 3435