Prototype-Decomposed Knowledge Distillation for Learning Generalized Federated Representation

Cited: 0
Authors
Wu, Aming [1 ]
Yu, Jiaping [1 ]
Wang, Yuxuan [1 ]
Deng, Cheng [1 ]
Institutions
[1] Xidian Univ, Sch Elect Engn, Xian 710126, Peoples R China
Funding
National Key Research and Development Program of China; National Natural Science Foundation of China;
Keywords
Prototypes; Data models; Servers; Training; Feature extraction; Federated learning; Task analysis; Federated domain generalization; class prototypes; singular value decomposition; prototype decomposition; knowledge distillation;
DOI
10.1109/TMM.2024.3428352
CLC Number
TP [Automation Technology; Computer Technology];
Subject Classification Code
0812;
Abstract
Federated learning (FL) enables distributed clients to collaboratively learn a global model, making it a promising tool for improving data privacy in machine learning. However, despite many advances, FL performance usually degrades under domain shift when trained models are applied to unseen domains. To enhance generalization, we focus on federated domain generalization, which aims to generalize a federated model, trained on multiple source domains with different distributions, to an unseen target domain. We propose a novel approach, Prototype-Decomposed Knowledge Distillation (PDKD). Concretely, we first aggregate the local class prototypes learned by the different clients. Subsequently, Singular Value Decomposition (SVD) is employed to decompose the local prototypes into discriminative and generalized global prototypes that contain rich category-related information. Finally, the global prototypes are sent back to all clients, where knowledge distillation encourages the local client models to distill generalized knowledge from the global prototypes, boosting their generalization ability. Extensive experiments on multiple datasets demonstrate the effectiveness of our method. In particular, on the Office dataset, our method outperforms FedAvg by around 13.5%, showing that it is instrumental in improving the generalization ability of federated models.
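The abstract describes a three-step server-side pipeline: aggregate per-class prototypes from clients, decompose them with SVD into global prototypes, and distill those prototypes back into client models. The sketch below illustrates one plausible reading of that pipeline in PyTorch. It is not the paper's implementation: the function names, the rank-1 choice of the leading singular component, and the cross-entropy form of the distillation loss are all illustrative assumptions.

```python
# A minimal sketch of the PDKD-style pipeline, assuming PyTorch and toy shapes.
# The exact decomposition and distillation losses follow the paper (TMM 2024),
# not this illustration.
import torch
import torch.nn.functional as F

def aggregate_local_prototypes(client_prototypes):
    """Stack per-client class prototypes on the server.

    client_prototypes: list of [num_classes, feat_dim] tensors, one per client.
    Returns a [num_clients, num_classes, feat_dim] tensor.
    """
    return torch.stack(client_prototypes, dim=0)

def decompose_global_prototypes(aggregated):
    """Build global prototypes via SVD over each class's client matrix.

    For each class, the [num_clients, feat_dim] matrix of local prototypes is
    decomposed; the leading right singular vector, scaled by its singular
    value, serves as the global prototype. The rank-1 choice is an assumption
    made here for illustration.
    """
    num_clients, num_classes, feat_dim = aggregated.shape
    global_protos = []
    for c in range(num_classes):
        M = aggregated[:, c, :]                       # [num_clients, feat_dim]
        U, S, Vh = torch.linalg.svd(M, full_matrices=False)
        global_protos.append(S[0] * Vh[0])            # leading component
    return torch.stack(global_protos, dim=0)          # [num_classes, feat_dim]

def prototype_distillation_loss(features, labels, global_protos, tau=1.0):
    """Align client features with the broadcast global prototypes.

    A cross-entropy over feature-to-prototype similarities, a common
    prototype-distillation form; the paper's exact loss may differ.
    """
    logits = features @ global_protos.t() / tau       # [batch, num_classes]
    return F.cross_entropy(logits, labels)

# Toy usage: 3 clients, 4 classes, 16-dim features.
if __name__ == "__main__":
    torch.manual_seed(0)
    local_protos = [torch.randn(4, 16) for _ in range(3)]
    agg = aggregate_local_prototypes(local_protos)
    g = decompose_global_prototypes(agg)
    feats, labels = torch.randn(8, 16), torch.randint(0, 4, (8,))
    print(prototype_distillation_loss(feats, labels, g).item())
```

In this reading, the SVD acts as a denoising aggregator: the leading singular direction captures the category structure shared across clients while discarding client-specific variation, which is one way the decomposed prototypes could be both discriminative and generalized.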
Pages: 10991 - 11002
Page count: 12
Related Papers
50 in total
  • [1] A Prototype-Based Knowledge Distillation Framework for Heterogeneous Federated Learning
    Lyu, Feng
    Tang, Cheng
    Deng, Yongheng
    Liu, Tong
    Zhang, Yongmin
    Zhang, Yaoxue
    2023 IEEE 43RD INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS, ICDCS, 2023, : 37 - 47
  • [2] Prototype Similarity Distillation for Communication-Efficient Federated Unsupervised Representation Learning
    Zhang, Chen
    Xie, Yu
    Chen, Tingbin
    Mao, Wenjie
    Yu, Bin
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (11) : 6865 - 6876
  • [3] FedRCIL: Federated Knowledge Distillation for Representation based Contrastive Incremental Learning
    Psaltis, Athanasios
    Chatzikonstantinou, Christos
    Patrikakis, Charalampos Z.
    Daras, Petros
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS, ICCVW, 2023, : 3455 - 3464
  • [4] Global prototype distillation for heterogeneous federated learning
    Wu, Shu
    Chen, Jindou
    Nie, Xueli
    Wang, Yong
    Zhou, Xiancun
    Lu, Linlin
    Peng, Wei
    Nie, Yao
    Menhaj, Waseef
SCIENTIFIC REPORTS, 2024, 14 (1)
  • [5] When Federated Learning Meets Knowledge Distillation
    Pang, Xiaoyi
    Hu, Jiahui
    Sun, Peng
    Ren, Ju
    Wang, Zhibo
    IEEE WIRELESS COMMUNICATIONS, 2024, 31 (05) : 208 - 214
  • [6] Knowledge Distillation in Federated Learning: A Practical Guide
    Mora, Alessio
    Tenison, Irene
    Bellavista, Paolo
    Rish, Irina
    PROCEEDINGS OF THE THIRTY-THIRD INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2024, 2024, : 8188 - 8196
  • [7] Personalized Decentralized Federated Learning with Knowledge Distillation
    Jeong, Eunjeong
    Kountouris, Marios
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 1982 - 1987
  • [8] FedDKD: Federated learning with decentralized knowledge distillation
    Li, Xinjia
    Chen, Boyu
    Lu, Wenlian
    APPLIED INTELLIGENCE, 2023, 53 (15) : 18547 - 18563
  • [9] Federated Learning Algorithm Based on Knowledge Distillation
    Jiang, Donglin
    Shan, Chen
    Zhang, Zhihui
    2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND COMPUTER ENGINEERING (ICAICE 2020), 2020, : 163 - 167