An Incremental Learning framework for Large-scale CTR Prediction

Cited by: 3
Authors
Katsileros, Petros [1 ,2 ]
Mandilaras, Nikiforos [1 ,2 ]
Mallis, Dimitrios [1 ,2 ]
Pitsikalis, Vassilis [1 ,2 ]
Theodorakis, Stavros [1 ,2 ]
Chamiel, Gil [2 ]
Affiliations
[1] Deeplab, Athens, Greece
[2] Taboola Com, Tel Aviv, Israel
Keywords
Knowledge Distillation; Warm-Start; Incremental Learning; CTR prediction;
DOI
10.1145/3523227.3547390
Chinese Library Classification (CLC): TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract
In this work we introduce an incremental learning framework for Click-Through-Rate (CTR) prediction and demonstrate its effectiveness for Taboola's massive-scale recommendation service. Our approach enables rapid capture of emerging trends by warm-starting from previously deployed models and fine-tuning on "fresh" data only. Past knowledge is preserved via a teacher-student paradigm, in which the teacher provides a distillation signal that mitigates the catastrophic forgetting phenomenon. Our incremental learning framework enables significantly faster training and deployment cycles (12x speedup). We demonstrate a consistent Revenue Per Mille (RPM) lift over multiple traffic segments and a significant CTR increase on newly introduced items.
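The teacher-student objective sketched in the abstract can be illustrated as a combined loss: the student is fine-tuned on fresh click labels while a distillation term keeps its predictions close to those of the previously deployed teacher model. The following is a minimal, self-contained sketch of that idea in plain Python; the function names, the binary-cross-entropy form of the distillation term, and the `distill_weight` parameter are illustrative assumptions, not details taken from the paper.

```python
import math

def bce(p, y):
    # Binary cross-entropy of one predicted probability p against target y
    # (y may be a hard label 0/1 or a soft teacher probability).
    eps = 1e-7
    p = min(max(p, eps), 1 - eps)
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def incremental_loss(student_preds, teacher_preds, labels, distill_weight=0.5):
    """Combined objective for the warm-started student on fresh data:
    hard-label CTR loss plus a distillation term that penalizes drifting
    away from the previously deployed teacher model, mitigating
    catastrophic forgetting of past knowledge."""
    hard = sum(bce(s, y) for s, y in zip(student_preds, labels))
    soft = sum(bce(s, t) for s, t in zip(student_preds, teacher_preds))
    return (hard + distill_weight * soft) / len(labels)
```

Setting `distill_weight=0` recovers plain fine-tuning on fresh data; increasing it trades plasticity on new trends for retention of the teacher's past knowledge.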
Pages: 490-493
Page count: 4
Related Papers
(50 in total)
  • [1] Feature Staleness Aware Incremental Learning for CTR Prediction
    Wang, Zhikai
    Shen, Yanyan
    Zhang, Zibin
    Lin, Kangyi
    [J]. PROCEEDINGS OF THE THIRTY-SECOND INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2023, 2023, : 2352 - 2360
  • [2] Always Strengthen Your Strengths: A Drift-Aware Incremental Learning Framework for CTR Prediction
    Liu, Congcong
    Teng, Fei
    Zhao, Xiwei
    Lin, Zhangang
    Hu, Jinghe
    Shao, Jingping
    [J]. PROCEEDINGS OF THE 46TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2023, 2023, : 1806 - 1810
  • [3] An Incremental One Class Learning Framework for Large Scale Data
    Deng, Qilin
    Yang, Yi
    Shen, Furao
    Luo, Chaomin
    Zhao, Jinxi
    [J]. NEURAL INFORMATION PROCESSING, ICONIP 2016, PT II, 2016, 9948 : 411 - 418
  • [4] Incremental Learning of Random Forests for Large-Scale Image Classification
    Ristin, Marko
    Guillaumin, Matthieu
    Gall, Juergen
    Van Gool, Luc
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2016, 38 (03) : 490 - 503
  • [5] Incremental Learning of NCM Forests for Large-Scale Image Classification
    Ristin, Marko
    Guillaumin, Matthieu
    Gall, Juergen
    Van Gool, Luc
    [J]. 2014 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2014, : 3654 - 3661
  • [6] Ensemble Learning for Large-Scale Workload Prediction
    Singh, Nidhi
    Rao, Shrisha
    [J]. IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTING, 2014, 2 (02) : 149 - 165
  • [7] Large Scale Incremental Learning
    Wu, Yue
    Chen, Yinpeng
    Wang, Lijuan
    Ye, Yuancheng
    Liu, Zicheng
    Guo, Yandong
    Fu, Yun
    [J]. 2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 374 - 382
  • [8] A Large-Scale Ensemble Learning Framework for Demand Forecasting
    Park, Young-Jin
    Kim, Donghyun
    Odermatt, Frederic
    Lee, Juho
    Kim, Kyung-Min
    [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2022, : 378 - 387
  • [9] An online incremental learning support vector machine for large-scale data
    Zheng, Jun
    Shen, Furao
    Fan, Hongjun
    Zhao, Jinxi
    [J]. Neural Computing and Applications, 2013, 22 : 1023 - 1035
  • [10] An Online Incremental Learning Support Vector Machine for Large-scale Data
    Zheng, Jun
    Yu, Hui
    Shen, Furao
    Zhao, Jinxi
    [J]. ARTIFICIAL NEURAL NETWORKS-ICANN 2010, PT II, 2010, 6353 : 76 - +