FedDyn: A dynamic and efficient federated distillation approach on Recommender System

Cited by: 4
Authors
Jin, Cheng [1 ]
Chen, Xuandong [1 ]
Gu, Yi [1 ]
Li, Qun [1 ]
Affiliations
[1] Nanjing Univ Posts & Telecommun, Sch Comp Sci & Technol, Nanjing, Peoples R China
Keywords
Distributed Machine Learning; Federated Learning; Recommender System; Knowledge Distillation
DOI
10.1109/ICPADS56603.2022.00107
CLC Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Federated Learning (FL) is a popular distributed machine learning paradigm that enables devices to collaboratively train a centralized model without transmitting raw data. However, when the model becomes complex, the communication overhead on mobile devices can become unacceptably large under traditional FL methods. To address this problem, Federated Distillation (FD) was proposed as a federated version of knowledge distillation. Most recent FD methods compute each client's model outputs (logits) on a public proxy dataset as its local knowledge, and perform distillation on the server side against the average of the clients' logits. However, these FD methods are not robust and perform poorly in non-IID scenarios (where data is non-independent and non-identically distributed) such as Federated Recommendation (FR). To mitigate the non-IID problem and apply FD to FR, this paper proposes a novel method named FedDyn that constructs a proxy dataset and extracts local knowledge dynamically. FedDyn replaces the averaging strategy with focus distillation, which strengthens reliable knowledge and thereby addresses the non-IID problem of locally biased knowledge; averaging dilutes and perturbs knowledge because it treats reliable and unreliable knowledge as equally important. In addition, to prevent private user information from being inferred from the uploaded knowledge, a technique similar to local differential privacy protects this knowledge on the client side. Experimental results show that FedDyn converges faster and incurs lower communication overhead than the baselines on three datasets: MovieLens-100K, MovieLens-1M, and Pinterest.
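As a rough, non-authoritative illustration of the pipeline the abstract describes, the Python sketch below contrasts the plain logit-averaging strategy with a reliability-weighted aggregation in the spirit of focus distillation, and perturbs client logits with Laplace noise as a stand-in for the local-differential-privacy-style protection. All names (add_ldp_noise, focus_targets), the per-client reliability scores, and the Laplace mechanism are assumptions made for illustration, not the paper's actual implementation.

import numpy as np

# Client side: perturb local knowledge (logits) before upload, in the spirit
# of local differential privacy. epsilon/sensitivity are illustrative values,
# not parameters from the paper.
def add_ldp_noise(logits, epsilon=1.0, sensitivity=1.0, rng=None):
    rng = rng if rng is not None else np.random.default_rng()
    scale = sensitivity / epsilon  # Laplace mechanism scale
    return logits + rng.laplace(0.0, scale, size=logits.shape)

# Baseline FD aggregation: plain averaging, which treats reliable and
# unreliable client knowledge as equally important.
def average_targets(client_logits):
    return np.mean(client_logits, axis=0)

# Hypothetical "focus distillation" aggregation: weight each client's logits
# by a per-client reliability score so that reliable knowledge dominates the
# distillation target. The paper's actual weighting scheme may differ.
def focus_targets(client_logits, reliability):
    w = np.asarray(reliability, dtype=float)
    w = w / w.sum()                                 # normalize weights
    return np.tensordot(w, client_logits, axes=1)   # (K,) x (K,N,C) -> (N,C)

# Toy usage: 3 clients, 5 proxy-dataset items, 4 output classes.
rng = np.random.default_rng(0)
logits = np.stack([add_ldp_noise(rng.normal(size=(5, 4)), rng=rng)
                   for _ in range(3)])
soft_targets = focus_targets(logits, reliability=[0.7, 0.2, 0.1])
print(soft_targets.shape)  # (5, 4)

Under non-IID data, a client trained mostly on one user's interactions produces logits that are confident only on a narrow slice of items; down-weighting such clients in the aggregation is the intuition behind preferring a focus-style weighting over a plain average.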
Pages: 786-793
Page count: 8