Personalized Decentralized Federated Learning with Knowledge Distillation

Cited by: 1
Authors
Jeong, Eunjeong [1 ]
Kountouris, Marios [1 ]
Affiliation
[1] EURECOM, Commun Syst Dept, F-06410 Sophia Antipolis, France
Keywords
decentralized federated learning; personalization; knowledge distillation;
DOI
10.1109/ICC45041.2023.10279714
CLC Classification Number
TN [Electronics and Communication Technology]
Discipline Classification Code
0809
Abstract
Personalization in federated learning (FL) coordinates clients whose data or behavior varies widely. Whether these clients' models converge depends on how closely each user collaborates with others that have similar patterns or preferences. However, similarity is hard to quantify in a decentralized network, where each user has only limited knowledge of other users' models. To address this issue, we propose a personalized, fully decentralized FL algorithm that leverages knowledge distillation to let each device discern statistical distances between local models. Specifically, a client gauges its similarity to a neighbor by comparing the two models' intermediate outputs on its own local samples, as in knowledge distillation, and can thus improve its performance without sharing any local data. Our empirical studies demonstrate that the proposed algorithm reaches higher test accuracy in fewer iterations under highly non-independent and identically distributed (non-i.i.d.) data, benefits agents with small datasets, and does so without any central server.
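As a concrete illustration of the mechanism the abstract describes, here is a minimal sketch of distillation-based similarity scoring and similarity-weighted aggregation on a single client. This is not the authors' reference implementation: the helper names (`distillation_distance`, `mixing_weights`, `aggregate`), the temperature value, the KL-divergence distance, and the softmax weighting over distances are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def distillation_distance(local_model, neighbor_model, batch, temperature=2.0):
    # Compare the two models' temperature-softened outputs on the client's OWN
    # samples; only model parameters are exchanged, never raw data.
    local_model.eval()
    neighbor_model.eval()
    log_p_local = F.log_softmax(local_model(batch) / temperature, dim=1)
    p_neighbor = F.softmax(neighbor_model(batch) / temperature, dim=1)
    # KL(neighbor || local) serves as a statistical distance between the models.
    return F.kl_div(log_p_local, p_neighbor, reduction="batchmean").item()

def mixing_weights(distances, sharpness=1.0):
    # Closer neighbors (smaller distance) receive larger aggregation weights;
    # softmax over negative distances normalizes the weights to sum to one.
    scores = torch.tensor([-sharpness * d for d in distances])
    return torch.softmax(scores, dim=0).tolist()

@torch.no_grad()
def aggregate(local_model, neighbor_models, weights, self_weight=0.5):
    # Convex combination: keep `self_weight` of the local parameters and mix
    # in the rest from neighbors in proportion to their similarity weights.
    for name, param in local_model.named_parameters():
        mixed = sum(w * dict(m.named_parameters())[name].data
                    for w, m in zip(weights, neighbor_models))
        param.data.mul_(self_weight).add_(mixed, alpha=1.0 - self_weight)
```

In a decentralized round, a client would score each neighbor with `distillation_distance` on a local batch, convert the distances to weights with `mixing_weights`, and then call `aggregate`; temperature-scaled soft outputs play the same role here as in standard knowledge distillation.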
Pages: 1982-1987
Number of pages: 6
Related Papers (showing items [41]-[50] of 50)
  • [41] Privacy-Preserving Heterogeneous Personalized Federated Learning with Knowledge
    Pan, Y.
    Su, Z.
    Ni, J.
    Wang, Y.
    Zhou, J.
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11(6): 1-14
  • [42] Robust Multi-model Personalized Federated Learning via Model Distillation
    Muhammad, Adil
    Lin, Kai
    Gao, Jian
    Chen, Bincai
    ALGORITHMS AND ARCHITECTURES FOR PARALLEL PROCESSING, ICA3PP 2021, PT III, 2022, 13157: 432-446
  • [43] A federated learning framework based on transfer learning and knowledge distillation for targeted advertising
    Su, Caiyu
    Wei, Jinri
    Lei, Yuan
    Li, Jiahui
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [44] Federated Learning via Decentralized Dataset Distillation in Resource-Constrained Edge Environments
    Song, Rui
    Liu, Dai
    Chen, Dave Zhenyu
    Festag, Andreas
    Trinitis, Carsten
    Schulz, Martin
    Knoll, Alois
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023
  • [45] Data-driven federated learning in drug discovery with knowledge distillation
    Hanser, Thierry
    Ahlberg, Ernst
    Amberg, Alexander
    Anger, Lennart T.
    Barber, Chris
    Brennan, Richard J.
    Brigo, Alessandro
    Delaunois, Annie
    Glowienke, Susanne
    Greene, Nigel
    Johnston, Laura
    Kuhn, Daniel
    Kuhnke, Lara
    Marchaland, Jean-François
    Muster, Wolfgang
    Plante, Jeffrey
    Rippmann, Friedrich
    Sabnis, Yogesh
    Schmidt, Friedemann
    van Deursen, Ruud
    Werner, Stéphane
    White, Angela
    Wichard, Joerg
    Yukawa, Tomoya
    NATURE MACHINE INTELLIGENCE, 2025, 7(3): 423-436
  • [46] A Network Resource Aware Federated Learning Approach using Knowledge Distillation
    Mishra, Rahul
    Gupta, Hari Prabhat
    Dutta, Tanima
    IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS (IEEE INFOCOM WKSHPS 2021), 2021
  • [47] FEDGKD: Toward Heterogeneous Federated Learning via Global Knowledge Distillation
    Yao, Dezhong
    Pan, Wanning
    Dai, Yutong
    Wan, Yao
    Ding, Xiaofeng
    Yu, Chen
    Jin, Hai
    Xu, Zheng
    Sun, Lichao
    IEEE TRANSACTIONS ON COMPUTERS, 2024, 73(1): 3-17
  • [48] A Federated Domain Adaptation Algorithm Based on Knowledge Distillation and Contrastive Learning
    Huang, Fang
    Fang, Zhijun
    Shi, Zhicai
    Zhuang, Lehui
    Li, Xingchen
    Huang, Bo
    WUHAN UNIVERSITY JOURNAL OF NATURAL SCIENCES, 2022, 27(6): 499-507
  • [49] FedTKD: A Trustworthy Heterogeneous Federated Learning Based on Adaptive Knowledge Distillation
    Chen, Leiming
    Zhang, Weishan
    Dong, Cihao
    Zhao, Dehai
    Zeng, Xingjie
    Qiao, Sibo
    Zhu, Yichang
    Tan, Chee Wei
    ENTROPY, 2024, 26(1)
  • [50] Knowledge Distillation Based Defense for Audio Trigger Backdoor in Federated Learning
    Chen, Yu-Wen
    Ke, Bo-Hsu
    Chen, Bo-Zhong
    Chiu, Si-Rong
    Tu, Chun-Wei
    Kuo, Jian-Jhih
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023: 4271-4276