Personalized Knowledge Distillation for Recommender System

Cited by: 11
Authors:
Kang, SeongKu [1 ]
Lee, Dongha [2 ]
Kweon, Wonbin [3 ]
Yu, Hwanjo [1 ]
Affiliations:
[1] POSTECH, Dept Comp Sci & Engn, Pohang, South Korea
[2] UIUC, Dept Comp Sci, Champaign, IL USA
[3] POSTECH, Dept Convergence IT Engn, Pohang, South Korea
Keywords:
Recommender System; Knowledge Distillation; Model compression; Retrieval efficiency;
DOI: 10.1016/j.knosys.2021.107958
Chinese Library Classification:
TP18 (Artificial Intelligence Theory)
Subject Classification Codes:
081104; 0812; 0835; 1405
Abstract
Knowledge Distillation (KD) has been widely studied for recommender systems. KD is a model-independent strategy that produces a small but powerful student model by transferring knowledge from a pre-trained, large teacher model. Recent work has shown that knowledge from the teacher's representation space significantly improves the student model. The state-of-the-art method, named Distillation Experts (DE), adopts cluster-wise distillation, which transfers the knowledge of each representation cluster separately so that the various kinds of preference knowledge are distilled in a balanced manner. However, applying DE to a new environment is challenging because its performance depends heavily on several key assumptions and hyperparameters that must be tuned for each dataset and each base model. In this work, we propose a novel method, dubbed Personalized Hint Regression (PHR), which distills the preference knowledge in a balanced way without relying on any assumption about the representation space or on any method-specific hyperparameters. To circumvent clustering, PHR employs a personalization network that enables personalized distillation into the student space for each user/item representation, which can be viewed as a generalization of DE. Extensive experiments on real-world datasets show that PHR achieves performance comparable to, or even better than, DE tuned by a grid search over all of its hyperparameters. (c) 2021 Elsevier B.V. All rights reserved.
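As a rough illustration of the mechanism the abstract describes, the sketch below assumes a PyTorch-style setup: a small personalization network produces a per-representation projection from the teacher space to the student space, and a hint-regression (MSE) loss pulls the student representation toward the projected teacher hint. All module, function, and parameter names (PersonalizedHintRegression, hidden_dim, lambda_kd, etc.) are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch of personalized hint regression, not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PersonalizedHintRegression(nn.Module):
    def __init__(self, teacher_dim: int, student_dim: int, hidden_dim: int = 64):
        super().__init__()
        # Personalization network: maps each teacher representation to the
        # parameters of its own teacher-to-student projection matrix.
        self.personalization_net = nn.Sequential(
            nn.Linear(teacher_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, teacher_dim * student_dim),
        )
        self.teacher_dim = teacher_dim
        self.student_dim = student_dim

    def forward(self, teacher_emb: torch.Tensor, student_emb: torch.Tensor) -> torch.Tensor:
        # teacher_emb: (batch, teacher_dim), student_emb: (batch, student_dim)
        batch = teacher_emb.size(0)
        # One projection matrix per user/item representation.
        proj = self.personalization_net(teacher_emb)
        proj = proj.view(batch, self.student_dim, self.teacher_dim)
        # Project the (frozen) teacher representation into the student space.
        hint = torch.bmm(proj, teacher_emb.detach().unsqueeze(-1)).squeeze(-1)
        # Hint-regression loss pulls the student toward its personalized hint.
        return F.mse_loss(student_emb, hint)

# Usage sketch: the distillation term is added to the student's base loss,
# e.g. loss = base_loss + lambda_kd * phr(teacher_user_emb, student_user_emb),
# where lambda_kd is an assumed weighting hyperparameter.

Because the projection is generated per representation rather than per cluster, no clustering step or cluster-count hyperparameter is needed, which is the balanced, assumption-free behavior the abstract attributes to PHR.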
Pages: 14
Related Papers
50 records in total
  • [1] Knowledge Modeling for Personalized Travel Recommender
    Khalid, Atifah
    Rapa'ee, Suriani
    Yassin, Norlidza Mohd
    Lukose, Dickson
    [J]. KNOWLEDGE TECHNOLOGY, 2012, 295 : 72 - 81
  • [2] DE-RRD: A Knowledge Distillation Framework for Recommender System
    Kang, SeongKu
    Hwang, Junyoung
    Kweon, Wonbin
    Yu, Hwanjo
    [J]. CIKM '20: PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, 2020, : 605 - 614
  • [3] Topology Distillation for Recommender System
    Kang, SeongKu
    Hwang, Junyoung
    Kweon, Wonbin
    Yu, Hwanjo
    [J]. KDD '21: PROCEEDINGS OF THE 27TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2021, : 829 - 839
  • [4] Personalized Education: Blind Knowledge Distillation
    Deng, Xiang
    Zheng, Jian
    Zhang, Zhongfei
    [J]. COMPUTER VISION, ECCV 2022, PT XXXIV, 2022, 13694 : 269 - 285
  • [5] Extending Knowledge Distillation for Personalized Federation
    Ge, Huanhuan
    [J]. ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT II, ICIC 2024, 2024, 14876 : 392 - 403
  • [6] Personalized Web Page Recommender System using integrated Usage and Content Knowledge
    Gopalachari, M. Venu
    Sammulal, Po
    [J]. 2014 INTERNATIONAL CONFERENCE ON ADVANCED COMMUNICATION CONTROL AND COMPUTING TECHNOLOGIES (ICACCCT), 2014, : 1066 - 1071
  • [7] Recommender System: A Personalized TV Guide System
    de Avila, Paulo Muniz
    Zorzo, Sergio Donizetti
    [J]. E-BUSINESS AND TELECOMMUNICATIONS, 2011, 130 : 278 - 290
  • [8] Personalized recommender system for digital libraries
    Omisore, M.O.
    Samuel, O.W.
    [J]. International Journal of Web-Based Learning and Teaching Technologies, 2014, 9 (01) : 18 - 32
  • [9] Personalized Recommender System for Event Attendees
    Arens-Volland, Andreas
    Naudet, Yannick
    [J]. 2016 11TH INTERNATIONAL WORKSHOP ON SEMANTIC AND SOCIAL MEDIA ADAPTATION AND PERSONALIZATION (SMAP), 2016, : 65 - 70
  • [10] A personalized recommender system for SaaS services
    Afify, Yasmine M.
    Moawad, Ibrahim F.
    Badr, Nagwa L.
    Tolba, Mohamed F.
    [J]. CONCURRENCY AND COMPUTATION-PRACTICE & EXPERIENCE, 2017, 29 (04):