Improving Global Generalization and Local Personalization for Federated Learning

Cited by: 0
Authors
Meng, Lei [1 ,2 ]
Qi, Zhuang [1 ]
Wu, Lei [1 ]
Du, Xiaoyu [3 ]
Li, Zhaochuan [4 ]
Cui, Lizhen [1 ]
Meng, Xiangxu [1 ]
Affiliations
[1] Shandong Univ, Sch Software, Jinan 250101, Peoples R China
[2] Shandong Res Inst Ind Technol, Jinan 250098, Peoples R China
[3] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Nanjing 210094, Peoples R China
[4] Inspur, Jinan 250101, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Data models; Adaptation models; Optimization; Servers; Federated learning; Collaboration; Prototypes; Data heterogeneity; federated learning (FL); generalization; personalization; prototypical calibration; HEALTH;
DOI
10.1109/TNNLS.2024.3417452
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Federated learning aims to enable collaborative training among multiple clients with heterogeneous data in a privacy-preserving manner, producing either a generalized global model or personalized local models. However, existing methods typically struggle to balance the two goals, as optimizing one often degrades the other. To address this problem, this article presents personalized federated learning via cross-silo prototypical calibration (pFedCSPC), which improves the consistency of client knowledge by calibrating features from heterogeneous spaces and thereby strengthens the effectiveness of collaboration among clients. Specifically, pFedCSPC employs an adaptive aggregation method that provides each client with a personalized initial model, enabling rapid adaptation to its local task. It then learns class representation patterns on each client by clustering, averages the representations within each cluster to form local prototypes, and aggregates these on the server into global prototypes. The global prototypes serve as knowledge that guides local representation learning, which helps mitigate data imbalance and prevent overfitting. Moreover, pFedCSPC includes a cross-silo prototypical calibration (CSPC) module that uses contrastive learning to map heterogeneous features from different sources into a unified space, enhancing the generalization ability of the global model. Experiments on four datasets, covering performance comparison, ablation study, in-depth analysis, and case study, verify that pFedCSPC improves both global generalization and local personalization by calibrating cross-source features and strengthening collaboration effectiveness, respectively.
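The abstract outlines three computational steps: cluster-and-average construction of local prototypes on each client, server-side aggregation into global prototypes, and a contrastive cross-silo calibration that aligns heterogeneous features against those prototypes. The sketch below illustrates this pipeline in plain PyTorch under stated assumptions; it is not the authors' implementation, and all function names, the number of clusters per class, and the temperature value are hypothetical choices, with the calibration term rendered as a simple prototype-anchored contrastive loss.

```python
# Minimal sketch of the prototype pipeline described in the abstract.
# NOT the released pFedCSPC code; all names and hyperparameters are assumptions.
import torch
import torch.nn.functional as F
from sklearn.cluster import KMeans


def compute_local_prototypes(features, labels, n_clusters=2):
    """Client side: cluster each class's features, average within clusters,
    then collapse the cluster means into one local prototype per class."""
    prototypes = {}
    for c in labels.unique().tolist():
        class_feats = features[labels == c].detach().cpu()
        k = min(n_clusters, class_feats.shape[0])
        assign = torch.as_tensor(
            KMeans(n_clusters=k, n_init=10).fit_predict(class_feats.numpy())
        )
        cluster_means = torch.stack(
            [class_feats[assign == j].mean(dim=0) for j in range(k)]
        )
        prototypes[int(c)] = cluster_means.mean(dim=0)
    return prototypes


def aggregate_global_prototypes(client_prototypes):
    """Server side: average each class's local prototypes across clients."""
    collected = {}
    for protos in client_prototypes:
        for c, p in protos.items():
            collected.setdefault(c, []).append(p)
    return {c: torch.stack(ps).mean(dim=0) for c, ps in collected.items()}


def prototype_contrastive_loss(features, labels, global_protos, tau=0.5):
    """Calibration term (assumed form): pull each feature toward its class's
    global prototype and away from the other classes' prototypes."""
    classes = sorted(global_protos.keys())
    proto_mat = F.normalize(torch.stack([global_protos[c] for c in classes]), dim=1)
    feats = F.normalize(features, dim=1)
    logits = feats @ proto_mat.t() / tau
    targets = torch.tensor([classes.index(int(y)) for y in labels])
    return F.cross_entropy(logits, targets)
```

In this reading, a client would call compute_local_prototypes on its penultimate-layer features after local training, the server would combine the uploaded dictionaries with aggregate_global_prototypes, and the returned prototypes would enter each client's objective through prototype_contrastive_loss in the next round; whether the paper weights this term, keeps multiple prototypes per class, or uses a different similarity is not specified in the abstract.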
Pages: 12
Related Papers
50 records in total
  • [1] Wang, Yansong; Xu, Hui; Ali, Waqar; Zhou, Xiangmin; Shao, Jie. Bilateral Improvement in Local Personalization and Global Generalization in Federated Learning. IEEE INTERNET OF THINGS JOURNAL, 2024, 11(16): 27099-27111
  • [2] Zhang, Xiongtao; Wang, Ji; Bao, Weidong; Zhang, Yaohong; Zhu, Xiaomin; Peng, Hao; Zhao, Xiang. Improving Generalization and Personalization in Model-Heterogeneous Federated Learning. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024
  • [3] Chen, Minghui; Jiang, Meirui; Dou, Qi; Wang, Zehua; Li, Xiaoxiao. FedSoup: Improving Generalization and Personalization in Federated Learning via Selective Model Interpolation. MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION, MICCAI 2023, PT II, 2023, 14221: 318-328
  • [4] Li, Yuhang; Liu, Tong; Shen, Wenfeng; Cui, Yangguang; Lu, Weijia. Improving Generalization and Personalization in Long-Tailed Federated Learning via Classifier Retraining. EURO-PAR 2024: PARALLEL PROCESSING, PART II, EURO-PAR 2024, 2024, 14802: 408-423
  • [5] Bouacida, Nader; Hou, Jiahui; Zang, Hui; Liu, Xin. Adaptive Federated Dropout: Improving Communication Efficiency and Generalization for Federated Learning. IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS (IEEE INFOCOM WKSHPS 2021), 2021
  • [6] Caldarola, Debora; Caputo, Barbara; Ciccone, Marco. Improving Generalization in Federated Learning by Seeking Flat Minima. COMPUTER VISION, ECCV 2022, PT XXIII, 2022, 13683: 654-672
  • [7] Yan, Peng; Long, Guodong. Personalization Disentanglement for Federated Learning. 2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023: 318-323
  • [8] Pillutla, Krishna; Malik, Kshitiz; Mohamed, Abdelrahman; Rabbat, Michael; Sanjabi, Maziar; Xiao, Lin. Federated Learning with Partial Model Personalization. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [9] Nadiger, Chetan; Kumar, Anil; Abdelhak, Sherine. Federated Reinforcement Learning For Fast Personalization. 2019 IEEE SECOND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND KNOWLEDGE ENGINEERING (AIKE), 2019: 123-127
  • [10] Kulkarni, Viraj; Kulkarni, Milind; Pant, Aniruddha. Survey of Personalization Techniques for Federated Learning. PROCEEDINGS OF THE 2020 FOURTH WORLD CONFERENCE ON SMART TRENDS IN SYSTEMS, SECURITY AND SUSTAINABILITY (WORLDS4 2020), 2020: 794-797