Selective knowledge sharing for privacy-preserving federated distillation without a good teacher

Cited: 4
Authors
Shao, Jiawei [1 ]
Wu, Fangzhao [2 ]
Zhang, Jun [1 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Hong Kong, Peoples R China
[2] Microsoft Res Asia, Beijing, Peoples R China
Keywords
DOI
10.1038/s41467-023-44383-9
CLC Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Subject Classification Codes
07; 0710; 09;
Abstract
While federated learning (FL) is promising for efficient collaborative learning without revealing local data, it remains vulnerable to white-box privacy attacks, suffers from high communication overhead, and struggles to adapt to heterogeneous models. Federated distillation (FD) emerges as an alternative paradigm to tackle these challenges by transferring knowledge among clients instead of model parameters. Nevertheless, variations in local data distributions and the absence of a well-trained teacher model lead to misleading and ambiguous knowledge sharing that significantly degrades model performance. To address these issues, this paper proposes a selective knowledge sharing mechanism for FD, termed Selective-FD, which identifies accurate and precise knowledge from local and ensemble predictions, respectively. Empirical studies, backed by theoretical insights, demonstrate that our approach enhances the generalization capabilities of the FD framework and consistently outperforms baseline methods. We anticipate that our study will enable a privacy-preserving, communication-efficient, and heterogeneity-adaptive federated training framework.
Here, the authors show a federated distillation method to tackle these challenges, which leverages the strengths of knowledge distillation in a federated learning setting.
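The mechanism described in the abstract can be illustrated concretely. Below is a minimal Python sketch, assuming a shared unlabeled proxy dataset and plain confidence thresholds as stand-ins for the paper's local and ensemble selectors; the names client_share and server_aggregate and the thresholds tau and agree are hypothetical illustrations, not the authors' implementation.

# A minimal sketch of selective knowledge sharing for federated
# distillation. Confidence thresholds stand in for the paper's local
# and ensemble selectors; the function names and thresholds below are
# illustrative assumptions, not the authors' code.
import numpy as np

NUM_CLASSES = 10
rng = np.random.default_rng(0)

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def client_share(local_logits, tau=0.8):
    # Client-side selector (assumed): share a soft prediction on a
    # proxy sample only when the local model is confident enough,
    # filtering out potentially misleading knowledge.
    probs = softmax(local_logits)
    return [p if p.max() >= tau else None for p in probs]

def server_aggregate(shared, agree=0.6):
    # Server-side selector (assumed): average the predictions that
    # clients did share, and keep a proxy sample as a distillation
    # target only when the ensemble is unambiguous.
    soft_labels = []
    for per_sample in zip(*shared):
        votes = [p for p in per_sample if p is not None]
        if not votes:
            soft_labels.append(None)
            continue
        avg = np.mean(votes, axis=0)
        soft_labels.append(avg if avg.max() >= agree else None)
    return soft_labels

# Toy run: 3 clients produce logits for 5 shared proxy samples.
client_logits = [3.0 * rng.normal(size=(5, NUM_CLASSES)) for _ in range(3)]
shared = [client_share(logits) for logits in client_logits]
targets = server_aggregate(shared)
print(sum(t is not None for t in targets), "of 5 proxy samples kept")

Each client would then distill from the retained soft labels; proxy samples filtered out on either side simply contribute no distillation target, so only communication of per-sample predictions, not model parameters, is required.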
Pages: 11
Related Papers
50 records in total
  • [21] Federated learning for privacy-preserving AI
    Cheng, Yong
    Liu, Yang
    Chen, Tianjian
    Yang, Qiang
    COMMUNICATIONS OF THE ACM, 2020, 63 (12) : 33 - 36
  • [22] Privacy-Preserving and Reliable Federated Learning
    Lu, Yi
    Zhang, Lei
    Wang, Lulu
    Gao, Yuanyuan
    ALGORITHMS AND ARCHITECTURES FOR PARALLEL PROCESSING, ICA3PP 2021, PT III, 2022, 13157 : 346 - 361
  • [23] FedKGRec: privacy-preserving federated knowledge graph aware recommender system
    Ma, Xiao
    Zhang, Hongyu
    Zeng, Jiangfeng
    Duan, Yiqi
    Wen, Xuan
    APPLIED INTELLIGENCE, 2024, 54 (19) : 9028 - 9044
  • [25] Privacy-preserving data mining through knowledge model sharing
    Sharkey, Patrick
    Tian, Hongwei
    Zhang, Weining
    Xu, Shouhuai
    PRIVACY, SECURITY, AND TRUST IN KDD, 2008, 4890 : 97 - 115
  • [26] Privacy-Preserving and Traceable Federated Learning for data sharing in industrial IoT applications
    Chen, Junbao
    Xue, Jingfeng
    Wang, Yong
    Huang, Lu
    Baker, Thar
    Zhou, Zhixiong
    EXPERT SYSTEMS WITH APPLICATIONS, 2023, 213
  • [27] A Federated Learning Based Privacy-Preserving Data Sharing Scheme for Internet of Vehicles
    Wang, Yangpeng
    Xiong, Ling
    Niu, Xianhua
    Wang, Yunxiang
    Liang, Dexin
    FRONTIERS IN CYBER SECURITY, FCS 2022, 2022, 1726 : 18 - 33
  • [28] PPIDSG: A Privacy-Preserving Image Distribution Sharing Scheme with GAN in Federated Learning
    Ma, Yuting
    Yao, Yuanzhi
    Xu, Xiaohua
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 13, 2024 : 14272 - 14280
  • [29] Multi-Level ACE-based IoT Knowledge Sharing for Personalized Privacy-Preserving Federated Learning
    Wang, Jing
    Lin, Xi
    Wu, Jun
    Mao, Qinghua
    Pei, Bei
    Li, Jianhua
    Guo, Suchang
    Zhang, Baitao
    2023 19TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING, MSN 2023, 2023 : 843 - 848
  • [30] Communication-Efficient and Privacy-Preserving Federated Learning via Joint Knowledge Distillation and Differential Privacy in Bandwidth-Constrained Networks
    Gad, Gad
    Gad, Eyad
    Fadlullah, Zubair Md
    Fouda, Mostafa M.
    Kato, Nei
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2024, 73 (11) : 17586 - 17601