RecGPT: Generative Pre-training for Text-based Recommendation

Cited by: 0
Authors
Hoang Ngo [1]
Dat Quoc Nguyen [1]
Affiliations
[1] VinAI Research, Ho Chi Minh City, Vietnam
Keywords
None listed
DOI
Not available
CLC Number
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
We present the first domain-adapted and fully trained large language model, RecGPT-7B, and its instruction-following variant, RecGPT-7B-Instruct, for text-based recommendation. Experimental results on rating prediction and sequential recommendation tasks show that our model, RecGPT-7B-Instruct, outperforms previous strong baselines. We are releasing our RecGPT models as well as their pre-training and fine-tuning datasets to facilitate future research and downstream applications in text-based recommendation. Public Hugging Face links to our RecGPT models and datasets are available at: https://github.com/VinAIResearch/RecGPT.
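For readers who want to try the released checkpoints, the sketch below shows one plausible way to query the instruction-following model for a rating-prediction task via Hugging Face transformers. The model ID vinai/RecGPT-7B-Instruct and the prompt wording are assumptions inferred from the release description above, not details confirmed by this record; consult the linked repository for the actual model names and prompt templates.

```python
# Minimal sketch: querying the instruction-following model for a text-based
# recommendation task. The model ID and prompt format below are assumptions;
# see https://github.com/VinAIResearch/RecGPT for the released names/templates.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "vinai/RecGPT-7B-Instruct"  # assumed Hugging Face model ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
)

# Hypothetical rating-prediction prompt built from a user's review history.
prompt = (
    "A user wrote the following reviews:\n"
    "- Wireless earbuds: 'Great sound, but the battery dies fast.' (3/5)\n"
    "- USB-C charger: 'Fast and reliable.' (5/5)\n"
    "Predict the user's rating, from 1 to 5, for: Portable power bank."
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=32)

# Decode only the newly generated tokens (the model's answer).
answer = tokenizer.decode(
    output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(answer)
```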
Pages: 302-313 (12 pages)
Related Papers
(50 items in total)
  • [31] Pre-training Graph Transformer with Multimodal Side Information for Recommendation. Liu, Yong; Yang, Susen; Lei, Chenyi; Wang, Guoxin; Tang, Haihong; Zhang, Juyong; Sun, Aixin; Miao, Chunyan. Proceedings of the 29th ACM International Conference on Multimedia (MM 2021), 2021: 2853-2861.
  • [32] Towards more effective encoders in pre-training for sequential recommendation. Sun, Ke; Qian, Tieyun; Zhong, Ming; Li, Xuhui. World Wide Web, 2023, 26: 2801-2832.
  • [33] Pre-training a Transformer-Based Generative Model Using a Small Sepedi Dataset. Ramalepe, Simon Phetole; Modipa, Thipe I.; Davel, Marelie H. Artificial Intelligence Research (SACAIR 2024), 2025, 2326: 319-333.
  • [34] Pre-training Graph Neural Network for Cross Domain Recommendation. Wang, Chen; Liang, Yueqing; Liu, Zhiwei; Zhang, Tao; Yu, Philip S. 2021 IEEE Third International Conference on Cognitive Machine Intelligence (CogMI 2021), 2021: 140-145.
  • [35] UPRec: User-Aware Pre-training for Sequential Recommendation. Xiao, Chaojun; Xie, Ruobing; Yao, Yuan; Liu, Zhiyuan; Sun, Maosong; Zhang, Xu; Lin, Leyu. AI Open, 2023, 4: 137-144.
  • [36] PUNR: Pre-training with User Behavior Modeling for News Recommendation. Ma, Guangyuan; Liu, Hongtao; Wu, Xing; Qian, Wanhui; Lv, Zhepeng; Yang, Qing; Hu, Songlin. Findings of the Association for Computational Linguistics: EMNLP 2023, 2023: 8338-8347.
  • [37] Text-Guided HuBERT: Self-Supervised Speech Pre-Training via Generative Adversarial Networks. Ma, Duo; Yue, Xianghu; Ao, Junyi; Gao, Xiaoxue; Li, Haizhou. IEEE Signal Processing Letters, 2024, 31: 2055-2059.
  • [38] A Multi-strategy-based Pre-training Method for Cold-start Recommendation. Hao, Bowen; Yin, Hongzhi; Zhang, Jing; Li, Cuiping; Chen, Hong. ACM Transactions on Information Systems, 2023, 41(2).
  • [39] Algorithms for Cold-Start Game Recommendation Based on GNN Pre-training Model. Yang, Hongjuan; Tian, Gang; Xu, Chengrui; Wang, Rui. The Computer Journal, 2024, 67(9): 2787-2798.
  • [40] Pre-training for Abstractive Document Summarization by Reinstating Source Text. Zou, Yanyan; Zhang, Xingxing; Lu, Wei; Wei, Furu; Zhou, Ming. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 3646-3660.