KEEP: An Industrial Pre-Training Framework for Online Recommendation via Knowledge Extraction and Plugging

Cited by: 9
Authors
Zhang, Yujing [1 ]
Chan, Zhangming [1 ]
Xu, Shuhao [2 ]
Bian, Weijie [1 ]
Han, Shuguang [1 ]
Deng, Hongbo [1 ]
Zheng, Bo [1 ]
Affiliations
[1] Alibaba Grp, Beijing, Peoples R China
[2] Tsinghua Univ, Sch Software, Beijing, Peoples R China
Keywords
Online Recommendation; Pre-training; Knowledge Extraction; Knowledge Plugging;
DOI
10.1145/3511808.3557106
Chinese Library Classification (CLC)
TP [Automation technology; computer technology];
Discipline Classification Code
0812 ;
Abstract
An industrial recommender system generally presents a hybrid list that contains results from multiple subsystems. In practice, each subsystem is optimized with its own feedback data to avoid interference among different subsystems. However, we argue that such data usage may lead to sub-optimal online performance because of data sparsity. To alleviate this issue, we propose to extract knowledge from the super-domain, which contains web-scale, long-time-span impression data, and use it to assist the online recommendation task (the downstream task). To this end, we propose a novel industrial KnowlEdge Extraction and Plugging (KEEP) framework, a two-stage framework consisting of 1) a supervised pre-training knowledge extraction module on the super-domain, and 2) a plug-in network that incorporates the extracted knowledge into the downstream model. This design makes KEEP well suited to the incremental training used in online recommendation. Moreover, we design an efficient empirical approach for KEEP and share our hands-on experience from implementing KEEP in a large-scale industrial system. Experiments conducted on two real-world datasets demonstrate that KEEP achieves promising results. Notably, KEEP has also been deployed in the display advertising system at Alibaba, bringing a lift of +5.4% CTR and +4.7% RPM.
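The two-stage idea in the abstract can be illustrated with a minimal sketch: first train embeddings on abundant super-domain impressions, then freeze them and let only a small plug-in layer adapt to the sparse downstream feedback. This is an assumption-based toy (NumPy logistic models, synthetic data, hypothetical names such as `W_plug`), not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_items, dim = 50, 40, 8

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# --- Stage 1: supervised knowledge extraction on the (synthetic) super-domain ---
U = rng.normal(0, 0.1, (n_users, dim))   # user embedding table
V = rng.normal(0, 0.1, (n_items, dim))   # item embedding table
su = rng.integers(0, n_users, 5000)      # abundant super-domain impressions
si = rng.integers(0, n_items, 5000)
sy = (rng.random(5000) < sigmoid(5 * (U[su] * V[si]).sum(1))).astype(float)

for _ in range(30):                      # full-batch gradient steps on log-loss
    p = sigmoid((U[su] * V[si]).sum(1))
    g = (p - sy)[:, None]                # dL/dlogit per impression
    np.add.at(U, su, -0.5 * g * V[si] / len(sy))
    np.add.at(V, si, -0.5 * g * U[su] / len(sy))

# --- Stage 2: freeze the extracted knowledge, train only a plug-in layer ---
U_frozen, V_frozen = U.copy(), V.copy()  # extracted knowledge stays fixed
W_plug, b = rng.normal(0, 0.1, (dim,)), 0.0

du = rng.integers(0, n_users, 500)       # sparse downstream feedback
di = rng.integers(0, n_items, 500)
dy = (rng.random(500) < sigmoid(5 * (U_frozen[du] * V_frozen[di]).sum(1))).astype(float)

X = U_frozen[du] * V_frozen[di]          # frozen features fed to the plug-in
for _ in range(200):                     # only W_plug and b are updated
    p = sigmoid(X @ W_plug + b)
    g = p - dy
    W_plug -= 0.5 * X.T @ g / len(g)
    b -= 0.5 * g.mean()

acc = ((sigmoid(X @ W_plug + b) > 0.5) == dy).mean()
print(f"downstream accuracy with plugged knowledge: {acc:.2f}")
```

Because the pre-trained tables never receive downstream gradients, the downstream model can be retrained or trained incrementally without disturbing the extracted knowledge, which mirrors the incremental-training-friendly property the abstract highlights.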
Pages: 3684-3693 (10 pages)
Related Papers
50 in total
  • [21] Pre-training Graph Neural Network for Cross Domain Recommendation
    Wang, Chen
    Liang, Yueqing
    Liu, Zhiwei
    Zhang, Tao
    Yu, Philip S.
    [J]. 2021 IEEE THIRD INTERNATIONAL CONFERENCE ON COGNITIVE MACHINE INTELLIGENCE (COGMI 2021), 2021, : 140 - 145
  • [22] Towards more effective encoders in pre-training for sequential recommendation
    Sun, Ke
    Qian, Tieyun
    Zhong, Ming
    Li, Xuhui
    [J]. WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS, 2023, 26 (05): : 2801 - 2832
  • [23] UPRec: User-aware Pre-training for sequential Recommendation
    Xiao, Chaojun
    Xie, Ruobing
    Yao, Yuan
    Liu, Zhiyuan
    Sun, Maosong
    Zhang, Xu
    Lin, Leyu
    [J]. AI OPEN, 2023, 4 : 137 - 144
  • [24] PRKG: Pre-Training Representation and Knowledge-Graph-Enhanced Web Service Recommendation for Mashup Creation
    Cao, Buqing
    Peng, Mi
    Xie, Ziming
    Liu, Jianxun
    Ye, Hongfan
    Li, Bing
    Fletcher, Kenneth K.
    [J]. IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, 2024, 21 (02): : 1737 - 1749
  • [25] User Behavior Pre-training for Online Fraud Detection
    Liu, Can
    Gao, Yuncong
    Sun, Li
    Feng, Jinghua
    Yang, Hao
    Ao, Xiang
    [J]. PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 3357 - 3365
  • [26] A Pre-training Approach for Stance Classification in Online Forums
    Tshimula, Jean Marie
    Chikhaoui, Belkacem
    Wang, Shengrui
    [J]. 2020 IEEE/ACM INTERNATIONAL CONFERENCE ON ADVANCES IN SOCIAL NETWORKS ANALYSIS AND MINING (ASONAM), 2020, : 280 - 287
  • [27] PERM: Pre-training Question Embeddings via Relation Map for Improving Knowledge Tracing
    Wang, Wentao
    Ma, Huifang
    Zhao, Yan
    Yang, Fanyi
    Chang, Liang
    [J]. DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, DASFAA 2022, PT III, 2022, : 281 - 288
  • [28] A Method of Relation Extraction Using Pre-training Models
    Wang, Yu
    Sun, Yining
    Ma, Zuchang
    Gao, Lisheng
    Xu, Yang
    Wu, Yichen
    [J]. 2020 13TH INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND DESIGN (ISCID 2020), 2020, : 176 - 179
  • [29] Numerical Tuple Extraction from Tables with Pre-training
    Yang, Qingping
    Cao, Yixuan
    Luo, Ping
    [J]. PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 2233 - 2241
  • [30] Contrastive Language-knowledge Graph Pre-training
    Yuan, Xiaowei
    Liu, Kang
    Wang, Yequan
    [J]. ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2024, 23 (04)