Selective knowledge sharing for privacy-preserving federated distillation without a good teacher

Cited by: 4
Authors
Shao, Jiawei [1 ]
Wu, Fangzhao [2 ]
Zhang, Jun [1 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Hong Kong, Peoples R China
[2] Microsoft Res Asia, Beijing, Peoples R China
DOI
10.1038/s41467-023-44383-9
CLC classification: O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject classification: 07; 0710; 09
Abstract
While federated learning (FL) is promising for efficient collaborative learning without revealing local data, it remains vulnerable to white-box privacy attacks, suffers from high communication overhead, and struggles to adapt to heterogeneous models. Federated distillation (FD) emerges as an alternative paradigm to tackle these challenges, transferring knowledge among clients instead of model parameters. Nevertheless, variations in local data distributions and the absence of a well-trained teacher model lead to misleading and ambiguous knowledge sharing that significantly degrades model performance. To address these issues, this paper proposes a selective knowledge sharing mechanism for FD, termed Selective-FD, to identify accurate and precise knowledge from local and ensemble predictions, respectively. Empirical studies, backed by theoretical insights, demonstrate that this approach enhances the generalization capabilities of the FD framework and consistently outperforms baseline methods. The authors anticipate their study will enable a privacy-preserving, communication-efficient, and heterogeneity-adaptive federated training framework.

Editor's summary: Here, the authors show a federated distillation method that leverages the strengths of knowledge distillation in a federated learning setting to tackle these challenges.
Pages: 11