Personalized Decentralized Federated Learning with Knowledge Distillation

Cited by: 1
Authors
Jeong, Eunjeong [1 ]
Kountouris, Marios [1 ]
Affiliations
[1] EURECOM, Commun Syst Dept, F-06410 Sophia Antipolis, France
Keywords
decentralized federated learning; personalization; knowledge distillation;
DOI
10.1109/ICC45041.2023.10279714
Chinese Library Classification (CLC)
TN [Electronic Technology, Communication Technology];
Subject Classification Code
0809;
Abstract
Personalization in federated learning (FL) acts as a coordinator for clients whose data or behavior vary widely. The convergence of these clients' models depends on how closely users collaborate with others who share similar patterns or preferences. However, quantifying similarity is generally challenging when users in a decentralized network have only limited knowledge of other users' models. To address this issue, we propose a personalized and fully decentralized FL algorithm that leverages knowledge distillation to enable each device to discern the statistical distance between local models. Each client device can enhance its performance without sharing local data by estimating the similarity between intermediate outputs produced when local samples are fed through the models, as in knowledge distillation. Our empirical studies demonstrate that the proposed algorithm improves clients' test accuracy in fewer iterations under highly non-independent and identically distributed (non-i.i.d.) data distributions, and benefits agents with small datasets, even without a central server.
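As a rough illustration of the idea described in the abstract, the sketch below shows one way a client in a serverless network could weight its neighbors by comparing softened model outputs on its own local samples and then distill from them. This is a minimal, assumed sketch, not the authors' exact algorithm: the KL divergence as the statistical distance, the exp(-distance) similarity weight, the temperature value, and names such as kd_similarity and local_distillation_step are all illustrative choices.

# Minimal, illustrative PyTorch sketch (assumptions noted above), not the paper's exact method.
import torch
import torch.nn.functional as F

TEMPERATURE = 2.0  # softening temperature for distillation outputs (assumed value)

def make_client_model(in_dim=20, num_classes=5):
    # Tiny stand-in for each client's local model.
    return torch.nn.Linear(in_dim, num_classes)

def kd_similarity(own_model, neighbor_model, local_x):
    # Estimate similarity by comparing softened outputs on the client's OWN samples;
    # in practice the neighbor would return its outputs, so no raw data is shared.
    with torch.no_grad():
        p_own = F.softmax(own_model(local_x) / TEMPERATURE, dim=1)
        log_p_nbr = F.log_softmax(neighbor_model(local_x) / TEMPERATURE, dim=1)
        # KL divergence as the statistical distance between the two output distributions.
        distance = F.kl_div(log_p_nbr, p_own, reduction="batchmean")
    # Map distance to a similarity weight in (0, 1]; smaller distance -> larger weight.
    return torch.exp(-distance).item()

def local_distillation_step(own_model, neighbor_models, local_x, local_y, lr=0.1):
    # One local update: supervised loss plus similarity-weighted distillation
    # toward each neighbor's soft predictions.
    weights = [kd_similarity(own_model, nbr, local_x) for nbr in neighbor_models]
    opt = torch.optim.SGD(own_model.parameters(), lr=lr)
    opt.zero_grad()
    logits = own_model(local_x)
    loss = F.cross_entropy(logits, local_y)
    for w, nbr in zip(weights, neighbor_models):
        with torch.no_grad():
            target = F.softmax(nbr(local_x) / TEMPERATURE, dim=1)
        loss = loss + w * F.kl_div(
            F.log_softmax(logits / TEMPERATURE, dim=1), target, reduction="batchmean"
        )
    loss.backward()
    opt.step()
    return loss.item(), weights

if __name__ == "__main__":
    torch.manual_seed(0)
    x, y = torch.randn(32, 20), torch.randint(0, 5, (32,))
    me = make_client_model()
    neighbors = [make_client_model() for _ in range(3)]
    loss, weights = local_distillation_step(me, neighbors, x, y)
    print(f"loss={loss:.3f}, neighbor similarity weights={weights}")

In this toy setup, neighbors whose softened predictions are closer to the client's own receive larger distillation weights, mirroring the paper's goal of collaborating more with statistically similar clients.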
Pages: 1982 - 1987
Number of pages: 6
Related Papers
50 records in total
  • [31] DKD-pFed: A novel framework for personalized federated learning via decoupling knowledge distillation and feature decorrelation
    Su, Liwei
    Wang, Donghao
    Zhu, Jinghua
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 259
  • [32] Decentralized Two-Stage Federated Learning with Knowledge Transfer
    Jin, Tong
    Chen, Siguang
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 3181 - 3186
  • [33] Efficient Federated Learning for AIoT Applications Using Knowledge Distillation
    Liu, Tian
    Xia, Jun
    Ling, Zhiwei
    Fu, Xin
    Yu, Shui
    Chen, Mingsong
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (08) : 7229 - 7243
  • [34] Communication-efficient federated learning via knowledge distillation
    Wu, Chuhan
    Wu, Fangzhao
    Lyu, Lingjuan
    Huang, Yongfeng
    Xie, Xing
    NATURE COMMUNICATIONS, 2022, 13 (01)
  • [35] Fedadkd: heterogeneous federated learning via adaptive knowledge distillation
    Yalin Song
    Hang Liu
    Shuai Zhao
    Haozhe Jin
    Junyang Yu
    Yanhong Liu
    Rui Zhai
    Longge Wang
    Pattern Analysis and Applications, 2024, 27 (4)
  • [36] Preservation of the Global Knowledge by Not-True Distillation in Federated Learning
    Lee, Gihun
    Jeong, Minchan
    Shin, Yongjin
    Bae, Sangmin
    Yun, Se-Young
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [37] Data-Free Knowledge Distillation for Heterogeneous Federated Learning
    Zhu, Zhuangdi
    Hong, Junyuan
    Zhou, Jiayu
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [38] Personalized federated learning via decoupling self-knowledge distillation and global adaptive aggregation
    ZhiWei Tang
    ShuWei Xu
    HaoZhe Jin
    ShiChong Liu
    Rui Zhai
    Ke Lu
    Multimedia Systems, 2025, 31 (2)
  • [39] Communication-Efficient Personalized Federated Edge Learning for Decentralized Sensing in ISAC
    Zhu, Yonghui
    Zhang, Ronghui
    Cui, Yuanhao
    Wu, Sheng
    Jiang, Chunxiao
    Jing, Xiaojun
    2023 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS WORKSHOPS, ICC WORKSHOPS, 2023, : 207 - 212
  • [40] Knowledge-Aware Parameter Coaching for Personalized Federated Learning
    Zhi, Mingjian
    Bi, Yuanguo
    Xu, Wenchao
    Wang, Haozhao
    Xiang, Tianao
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 15, 2024, : 17069 - 17077