Pivotal Role of Language Modeling in Recommender Systems: Enriching Task-specific and Task-agnostic Representation Learning

Cited by: 0
Authors
Shin, Kyuyong [1 ,2 ]
Kwak, Hanock [1 ]
Kim, Wonjae [2 ]
Jeong, Jisu [1 ,2 ]
Jung, Seungjae [1 ]
Kim, Kyung-Min [1 ,2 ]
Ha, Jung-Woo [2 ]
Lee, Sang-Woo [1 ,2 ]
Affiliations
[1] NAVER, Seongnam, South Korea
[2] NAVER AI Lab, Seongnam, South Korea
Keywords
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Recent studies have proposed unified user modeling frameworks that leverage user behavior data from various applications. Many of these frameworks benefit from treating users' behavior sequences as plain text, which can represent rich information from any domain or system without loss of generality. Hence, a question arises: can language modeling on a user history corpus help improve recommender systems? While the versatility of language modeling has been widely investigated in many domains, its application to recommender systems remains underexplored. We show that language modeling applied directly to task-specific user histories achieves excellent results on diverse recommendation tasks. Moreover, leveraging additional task-agnostic user histories delivers significant performance benefits. We further demonstrate that our approach provides promising transfer learning capabilities for a broad spectrum of real-world recommender systems, even on unseen domains and services.
Pages: 1146-1161
Number of pages: 16
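
The abstract describes representing user behavior sequences as plain text and applying language modeling to them for recommendation. As a rough illustration only, and not the paper's actual architecture or training setup, the sketch below serializes a hypothetical user history into text and ranks candidate items by their average token log-likelihood under an off-the-shelf causal language model. The model choice ("gpt2"), the behavior strings, the candidate list, and the candidate_score helper are all assumed placeholders.

```python
# Minimal sketch (assumptions, not the authors' method): score candidate items
# for a user by the likelihood a causal LM assigns to them after the user's
# behavior history, serialized as plain text.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # placeholder model choice
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

# A task-agnostic user history serialized as plain text (hypothetical events).
history = (
    "searched for wireless earbuds. viewed noise-cancelling headphones. "
    "purchased a phone case. read a review of a smart watch."
)

# Candidate next items for a recommendation task (hypothetical).
candidates = ["bluetooth speaker", "gardening gloves", "usb-c charging cable"]

def candidate_score(history_text: str, candidate: str) -> float:
    """Average log-likelihood of the candidate's tokens given the history."""
    prompt_ids = tokenizer(history_text + " next item: ", return_tensors="pt").input_ids
    cand_ids = tokenizer(candidate, return_tensors="pt").input_ids
    input_ids = torch.cat([prompt_ids, cand_ids], dim=1)
    with torch.no_grad():
        logits = model(input_ids).logits
    # Log-probabilities over the vocabulary for each next-token prediction.
    log_probs = torch.log_softmax(logits[:, :-1, :], dim=-1)
    # Positions whose predictions correspond to the candidate's tokens.
    cand_positions = range(prompt_ids.size(1) - 1, input_ids.size(1) - 1)
    token_log_probs = [
        log_probs[0, pos, input_ids[0, pos + 1]].item() for pos in cand_positions
    ]
    return sum(token_log_probs) / len(token_log_probs)

# Rank candidates by how plausible the LM finds them given this user's history.
ranked = sorted(candidates, key=lambda c: candidate_score(history, c), reverse=True)
print(ranked)
```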