Improving Transformer-based Sequential Recommenders through Preference Editing

Cited by: 14
Authors
Ma, Muyang [1 ]
Ren, Pengjie [1 ]
Chen, Zhumin [1 ]
Ren, Zhaochun [1 ]
Liang, Huasheng [2 ]
Ma, Jun [1 ]
De Rijke, Maarten [3 ]
Affiliations
[1] Shandong Univ, Shandong, Peoples R China
[2] Tencent, WeChat, Beijing, Peoples R China
[3] Univ Amsterdam, Amsterdam, Netherlands
Keywords
Transformer-based sequential recommendation; self-supervised learning; user preference extraction and representation;
DOI
10.1145/3564282
Chinese Library Classification: TP [Automation and computer technology]
Discipline code: 0812
Abstract
One of the key challenges in sequential recommendation is how to extract and represent user preferences. Traditional methods rely solely on predicting the next item, but user behavior may be driven by complex preferences, so these methods cannot make accurate recommendations when the available information about user behavior is limited. To explore multiple user preferences, we propose a transformer-based sequential recommendation model named MrTransformer (Multi-preference Transformer). To train MrTransformer, we devise a preference-editing-based self-supervised learning (SSL) mechanism that derives extra supervision signals from relations with other sequences. The idea is to force the sequential recommendation model to discriminate between the common and unique preferences in different sequences of interactions. By doing so, the model learns to disentangle user preferences into multiple independent preference representations, improving user preference extraction and representation. We carry out extensive experiments on five benchmark datasets. MrTransformer with preference editing significantly outperforms state-of-the-art sequential recommendation methods in terms of Recall, MRR, and NDCG. We find that long sequences of interactions, from which user preferences are harder to extract and represent, benefit most from preference editing.
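The core of preference editing is comparing two users' disentangled preference representations, separating the preferences they share from those unique to each, and swapping the shared parts as a self-supervision signal. The sketch below illustrates this comparison-and-swap idea only; the similarity threshold, the greedy alignment, and the function names are illustrative assumptions, not the paper's actual mechanism, which learns the split end-to-end inside the transformer.

```python
import numpy as np

def split_preferences(prefs_a, prefs_b, threshold=0.8):
    """Mark which preference vectors of user A / user B are 'common'
    (matched by a similar vector on the other side) vs 'unique'.
    Uses a simple cosine-similarity heuristic for illustration."""
    def unit(x):
        return x / np.linalg.norm(x, axis=-1, keepdims=True)
    sim = unit(prefs_a) @ unit(prefs_b).T      # (K_a, K_b) cosine similarities
    common_a = sim.max(axis=1) >= threshold    # A's vectors matched by some B vector
    common_b = sim.max(axis=0) >= threshold    # B's vectors matched by some A vector
    return common_a, common_b

def preference_edit(prefs_a, prefs_b, common_a, common_b):
    """'Edit' A's representation by swapping in B's common preferences.
    An SSL objective would then require the edited representation to still
    behave like user A, since only shared preferences were exchanged."""
    edited = prefs_a.copy()
    idx_a = np.where(common_a)[0]
    idx_b = np.where(common_b)[0]
    # naive in-order alignment of the common slots (illustrative only)
    for i, j in zip(idx_a, idx_b):
        edited[i] = prefs_b[j]
    return edited
```

In the actual model the swap happens on learned latent representations and the supervision comes from reconstructing both interaction sequences from the edited representations; the threshold-based split here only makes the common/unique distinction concrete.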
Pages: 24
Related papers
50 records in total
  • [31] Monitoring Student Attendance Through Vision Transformer-based Iris Recognition
    Ennajar, Slimane
    Bouarifi, Walid
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2024, 15 (02) : 698 - 707
  • [32] Designing a Movie Recommendation System Through a Transformer-Based Embeddings Space
    Iglesias, Oscar I. R.
    Pardo, Carlos E. B.
    Lopez, Jose Onate
    Quintero, Christian G. M.
    2024 IEEE COLOMBIAN CONFERENCE ON COMMUNICATIONS AND COMPUTING, COLCOM 2024, 2024,
  • [33] Improving Rumor Detection by Promoting Information Campaigns With Transformer-Based Generative Adversarial Learning
    Ma, Jing
    Li, Jun
    Gao, Wei
    Yang, Yang
    Wong, Kam-Fai
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (03) : 2657 - 2670
  • [34] Improving Transformer-based Speech Recognition Systems with Compressed Structure and Speech Attributes Augmentation
    Li, Sheng
    Raj, Dabre
    Lu, Xugang
    Shen, Peng
    Kawahara, Tatsuya
    Kawai, Hisashi
    INTERSPEECH 2019, 2019, : 4400 - 4404
  • [35] DTMP-prime: A deep transformer-based model for predicting prime editing efficiency and PegRNA activity
    Alipanahi, Roghayyeh
    Safari, Leila
    Khanteymoori, Alireza
    MOLECULAR THERAPY NUCLEIC ACIDS, 2024, 35 (04):
  • [36] A simple transformer-based baseline for crowd tracking with Sequential Feature Aggregation and Hybrid Group Training
    Wang, Cui
    Wu, Zewei
    Ke, Wei
    Xiong, Zhang
    JOURNAL OF VISUAL COMMUNICATION AND IMAGE REPRESENTATION, 2024, 100
  • [37] Improving Sequential Model Editing with Fact Retrieval
    Han, Xiaoqi
    Li, Ru
    Tan, Hongye
    Wang, Yuanlong
    Chai, Qinghua
    Pan, Jeff Z.
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EMNLP 2023), 2023, : 11209 - 11224
  • [38] Transformer-Based Approach to Melanoma Detection
    Cirrincione, Giansalvo
    Cannata, Sergio
    Cicceri, Giovanni
    Prinzi, Francesco
    Currieri, Tiziana
    Lovino, Marta
    Militello, Carmelo
    Pasero, Eros
    Vitabile, Salvatore
    SENSORS, 2023, 23 (12)
  • [39] Transformer-based Bug/Feature Classification
    Ozturk, Ceyhun E.
    Yilmaz, Eyup Halit
    Koksal, Omer
    2023 31ST SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE, SIU, 2023,
  • [40] Identifying suicidal emotions on social media through transformer-based deep learning
    Kodati, Dheeraj
    Tene, Ramakrishnudu
    APPLIED INTELLIGENCE, 2023, 53 : 11885 - 11917