Compression Boosts Differentially Private Federated Learning

Cited: 19
Authors
Kerkouche, Raouf [1 ]
Acs, Gergely [2 ]
Castelluccia, Claude [1 ]
Geneves, Pierre [3 ]
Affiliations
[1] Univ Grenoble Alpes, INRIA, Privat Team, F-38000 Grenoble, France
[2] BME HIT, Crysys Lab, Budapest, Hungary
[3] Univ Grenoble Alpes, CNRS, INRIA, Grenoble INP,LIG,Tyrex Team, F-38000 Grenoble, France
Keywords
Federated Learning; Compressive Sensing; Differential Privacy; Compression; Denoising; Bandwidth Efficiency; Scalability;
DOI
10.1109/EuroSP51992.2021.00029
CLC Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Federated Learning allows distributed entities to collaboratively train a common model without sharing their own data. Although it prevents data collection and aggregation by exchanging only parameter updates, it remains vulnerable to various inference and reconstruction attacks in which a malicious entity can learn private information about participants' training data from the captured gradients. Differential Privacy provides theoretically sound guarantees against such inference attacks by adding noise to the exchanged update vectors. However, the added noise is proportional to the model size, which can be very large for modern neural networks, and this can result in poor model quality. In this paper, compressive sensing is used to reduce the model size and hence increase model quality without sacrificing privacy. We show experimentally, on two datasets, that our privacy-preserving proposal reduces communication costs by up to 95% with only a negligible performance penalty compared to traditional non-private federated learning schemes.
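The intuition behind the abstract — the total noise injected by the Gaussian mechanism grows with the number of transmitted coordinates, so projecting the update to a lower dimension before privatization reduces the overall noise burden — can be sketched as follows. This is an illustrative toy, not the paper's actual algorithm: the dimensions, clipping bound, noise multiplier, and the dense random measurement matrix are all hypothetical placeholders, and sparse reconstruction on the server side is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

d, m = 10_000, 500            # model dimension, compressed dimension (hypothetical)
clip_norm, sigma = 1.0, 0.8   # L2 clipping bound and DP noise multiplier (hypothetical)

# A hypothetical client update (gradient) -- for illustration only.
grad = rng.normal(size=d) / np.sqrt(d)

# Shared random measurement matrix, as in compressive sensing:
# both client and server can regenerate it from a common seed.
A = rng.normal(size=(m, d)) / np.sqrt(m)

def privatize(v, clip_norm, sigma, rng):
    """Clip to a bounded L2 norm, then add Gaussian noise (Gaussian mechanism)."""
    v = v * min(1.0, clip_norm / np.linalg.norm(v))
    return v + rng.normal(scale=sigma * clip_norm, size=v.shape)

# Without compression: noise is added to all d coordinates.
noisy_full = privatize(grad, clip_norm, sigma, rng)

# With compression: project to m coordinates first, then privatize.
# The per-coordinate noise scale is the same, but only m coordinates
# are noised and transmitted, so the total injected noise energy is smaller.
noisy_compressed = privatize(A @ grad, clip_norm, sigma, rng)

print(noisy_full.shape, noisy_compressed.shape)  # (10000,) (500,)
```

In this sketch the communication saving is simply m/d (500/10,000 = 5% of the original payload, matching the up-to-95% reduction claimed in the abstract), and the privacy guarantee is unchanged because clipping and noising are applied to the compressed vector with the same sensitivity bound.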
Pages: 304 - 318 (15 pages)