Prototype-Decomposed Knowledge Distillation for Learning Generalized Federated Representation

Cited: 0
Authors
Wu, Aming [1 ]
Yu, Jiaping [1 ]
Wang, Yuxuan [1 ]
Deng, Cheng [1 ]
Affiliations
[1] Xidian Univ, Sch Elect Engn, Xian 710126, Peoples R China
Funding
National Key Research and Development Program of China; National Natural Science Foundation of China;
Keywords
Prototypes; Data models; Servers; Training; Feature extraction; Federated learning; Task analysis; Federated domain generalization; class prototypes; singular value decomposition; prototype decomposition; knowledge distillation;
DOI
10.1109/TMM.2024.3428352
Chinese Library Classification (CLC) Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Federated learning (FL) enables distributed clients to collaboratively learn a global model, suggesting its potential for improving data privacy in machine learning. However, despite many advances, FL performance usually degrades due to domain shift when the trained models are applied to unseen domains. To enhance the model's generalization ability, we focus on federated domain generalization, which aims to generalize a federated model trained on multiple source domains with different distributions to an unseen target domain. To this end, we propose a novel approach, Prototype-Decomposed Knowledge Distillation (PDKD). Concretely, we first aggregate the local class prototypes learned by different clients. Subsequently, Singular Value Decomposition (SVD) is employed to decompose the local prototypes into discriminative and generalized global prototypes that contain rich category-related information. Finally, the global prototypes are sent back to all clients, and knowledge distillation encourages the local client models to distill generalized knowledge from them, which boosts their generalization ability. Extensive experiments on multiple datasets demonstrate the effectiveness of our method. In particular, on the Office dataset our method outperforms FedAvg by around 13.5%, showing that it is instrumental in improving the generalization ability of federated models.
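The abstract describes a three-step pipeline: clients compute per-class prototypes, the server aggregates them and applies SVD to extract generalized global prototypes, and clients distill knowledge from the returned prototypes. The sketch below illustrates that flow only; the helper names (local_class_prototypes, decompose_global_prototypes, prototype_distillation_loss), the simple class-mean prototypes, the low-rank SVD reconstruction, and the MSE distillation term are illustrative assumptions, not the paper's actual PDKD formulation.

```python
# Illustrative sketch of the pipeline outlined in the abstract, not the
# authors' implementation: (1) clients build per-class prototypes,
# (2) the server decomposes the aggregated prototypes with SVD,
# (3) clients distill from the global prototypes.
import torch
import torch.nn.functional as F


def local_class_prototypes(features: torch.Tensor, labels: torch.Tensor,
                           num_classes: int) -> torch.Tensor:
    """Client side: mean feature per class (zeros for classes absent locally)."""
    protos = torch.zeros(num_classes, features.shape[1])
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            protos[c] = features[mask].mean(dim=0)
    return protos


def decompose_global_prototypes(client_protos: list, rank: int) -> torch.Tensor:
    """Server side: stack the clients' prototypes for each class and keep a
    low-rank SVD reconstruction before averaging into one global prototype."""
    stacked = torch.stack(client_protos)               # (clients, classes, dim)
    global_protos = []
    for c in range(stacked.shape[1]):
        P = stacked[:, c, :]                            # (clients, dim)
        U, S, Vh = torch.linalg.svd(P, full_matrices=False)
        k = min(rank, S.shape[0])
        P_low = U[:, :k] @ torch.diag(S[:k]) @ Vh[:k, :]  # rank-k reconstruction
        global_protos.append(P_low.mean(dim=0))         # one prototype per class
    return torch.stack(global_protos)                   # (classes, dim)


def prototype_distillation_loss(features: torch.Tensor, labels: torch.Tensor,
                                global_protos: torch.Tensor) -> torch.Tensor:
    """Client side: pull local features toward the matching global prototype."""
    return F.mse_loss(features, global_protos[labels])
```

In the described setting, a loss of this kind would be added to each client's local training objective so that client features align with the SVD-refined global prototypes; the class-mean prototypes and MSE term here merely stand in for whatever prototype construction and distillation objective PDKD actually uses.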
Pages: 10991-11002
Page count: 12
Related Papers
50 records in total
  • [31] Data-driven federated learning in drug discovery with knowledge distillation
    Hanser, Thierry; Ahlberg, Ernst; Amberg, Alexander; Anger, Lennart T.; Barber, Chris; Brennan, Richard J.; Brigo, Alessandro; Delaunois, Annie; Glowienke, Susanne; Greene, Nigel; Johnston, Laura; Kuhn, Daniel; Kuhnke, Lara; Marchaland, Jean-Francois; Muster, Wolfgang; Plante, Jeffrey; Rippmann, Friedrich; Sabnis, Yogesh; Schmidt, Friedemann; van Deursen, Ruud; Werner, Stephane; White, Angela; Wichard, Joerg; Yukawa, Tomoya
    NATURE MACHINE INTELLIGENCE, 2025: 423-436
  • [32] FedDK: Improving Cyclic Knowledge Distillation for Personalized Healthcare Federated Learning
    Xu, Yikai; Fan, Hongbo
    IEEE ACCESS, 2023, 11: 72409-72417
  • [33] Energy-Efficient Federated Knowledge Distillation Learning in Internet of Drones
    Cal, Semih; Sun, Xiang; Yao, Jingjing
    2024 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS WORKSHOPS, ICC WORKSHOPS 2024, 2024: 1256-1261
  • [34] FEDGKD: Toward Heterogeneous Federated Learning via Global Knowledge Distillation
    Yao, Dezhong; Pan, Wanning; Dai, Yutong; Wan, Yao; Ding, Xiaofeng; Yu, Chen; Jin, Hai; Xu, Zheng; Sun, Lichao
    IEEE TRANSACTIONS ON COMPUTERS, 2024, 73 (01): 3-17
  • [35] Knowledge Distillation Assisted Robust Federated Learning: Towards Edge Intelligence
    Qiao, Yu; Adhikary, Apurba; Kim, Ki Tae; Zhang, Chaoning; Hong, Choong Seon
    ICC 2024 - IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2024: 843-848
  • [36] A Federated Domain Adaptation Algorithm Based on Knowledge Distillation and Contrastive Learning
    HUANG Fang; FANG Zhijun; SHI Zhicai; ZHUANG Lehui; LI Xingchen; HUANG Bo
    Wuhan University Journal of Natural Sciences, 2022, 27 (06): 499-507
  • [37] A Personalized Federated Learning Method Based on Knowledge Distillation and Differential Privacy
    Jiang, Yingrui; Zhao, Xuejian; Li, Hao; Xue, Yu
    ELECTRONICS, 2024, 13 (17)
  • [38] FedTKD: A Trustworthy Heterogeneous Federated Learning Based on Adaptive Knowledge Distillation
    Chen, Leiming; Zhang, Weishan; Dong, Cihao; Zhao, Dehai; Zeng, Xingjie; Qiao, Sibo; Zhu, Yichang; Tan, Chee Wei
    ENTROPY, 2024, 26 (01)
  • [39] Personalized Federated Learning Method Based on Collation Game and Knowledge Distillation
    Sun Y.; Shi Y.; Li M.; Yang R.; Si P.
    Dianzi Yu Xinxi Xuebao/Journal of Electronics and Information Technology, 2023, 45 (10): 3702-3709
  • [40] Knowledge Distillation Based Defense for Audio Trigger Backdoor in Federated Learning
    Chen, Yu-Wen; Ke, Bo-Hsu; Chen, Bo-Zhong; Chiu, Si-Rong; Tu, Chun-Wei; Kuo, Jian-Jhih
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023: 4271-4276