FedDP-SA: Boosting Differentially Private Federated Learning via Local Data Set Splitting

Cited by: 0
Authors
Liu, Xuezheng [1 ]
Zhou, Yipeng [2 ]
Wu, Di [1 ]
Hu, Miao [1 ]
Hui Wang, Jessie [3 ,4 ]
Guizani, Mohsen [5 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangdong Key Lab Big Data Anal & Proc, Guangzhou 510006, Peoples R China
[2] Macquarie Univ, Fac Sci & Engn, Dept Comp, Sydney, NSW 2109, Australia
[3] Tsinghua Univ, Inst Network Sci & Cyberspace, Beijing 100084, Peoples R China
[4] Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol, Beijing 100084, Peoples R China
[5] Mohamed bin Zayed Univ Artificial Intelligence, Machine Learning Dept, Abu Dhabi, U Arab Emirates
Source
IEEE INTERNET OF THINGS JOURNAL | 2024, Vol. 11, Issue 19
Funding
National Natural Science Foundation of China;
Keywords
Noise; Privacy; Computational modeling; Data models; Differential privacy; Accuracy; Internet of Things; Data splitting; federated learning (FL); Gaussian mechanism; sensitivity and convergence rate;
DOI
10.1109/JIOT.2024.3421991
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Federated learning (FL) has emerged as an attractive collaborative machine learning framework that enables training models across decentralized devices by exposing only model parameters. However, malicious attackers can still hijack the communicated parameters to expose clients' raw samples, resulting in privacy leakage. To defend against such attacks, differentially private FL (DPFL) has been devised, which incurs negligible computation overhead in protecting privacy by adding noise. Nevertheless, low model utility and poor communication efficiency make DPFL difficult to deploy in real environments. To overcome these deficiencies, we propose a novel DPFL algorithm called FedDP-SA (namely, federated learning with differential privacy by splitting local data sets and averaging parameters). Specifically, FedDP-SA splits a local data set into multiple subsets for parameter updating. Then, the parameters averaged over all subsets, plus differential privacy (DP) noise, are returned to the parameter server. FedDP-SA offers dual benefits: 1) enhancing model accuracy by efficiently lowering sensitivity, thereby reducing the noise required to ensure DP, and 2) improving communication efficiency by communicating model parameters at a lower frequency. These advantages are validated through sensitivity analysis and convergence rate analysis. Finally, we conduct comprehensive experiments to verify the performance of FedDP-SA against other state-of-the-art baseline algorithms.
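The client-side mechanism described in the abstract (split the local data set, update per subset, average, then perturb with Gaussian noise) can be sketched as follows. This is only an illustrative reading of the abstract, not the paper's exact algorithm: the function name, the toy least-squares gradient, and all hyperparameters are assumptions introduced here.

```python
import numpy as np

def feddp_sa_client_update(global_params, local_data, n_subsets,
                           lr, clip_norm, noise_std, rng):
    """One FedDP-SA-style local round (illustrative sketch).

    Split the local data set into subsets, take one clipped gradient step
    per subset starting from the same global parameters, average the
    resulting models, and perturb the average with Gaussian noise before
    uploading it to the parameter server.
    """
    subsets = np.array_split(local_data, n_subsets)
    local_models = []
    for subset in subsets:
        # Toy least-squares gradient: pull params toward the subset mean.
        grad = global_params - subset.mean(axis=0)
        # Clip the per-subset update to bound its sensitivity.
        norm = np.linalg.norm(grad)
        if norm > clip_norm:
            grad = grad * (clip_norm / norm)
        local_models.append(global_params - lr * grad)
    # Averaging over subsets scales the sensitivity down (roughly by
    # 1/n_subsets), so less Gaussian noise is needed for the same DP level.
    averaged = np.mean(local_models, axis=0)
    return averaged + rng.normal(0.0, noise_std, size=averaged.shape)
```

Because only the noised average over all subsets is uploaded once per round, the sketch also reflects the claimed communication benefit: one upload covers several local updates.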
Pages: 31687 - 31698
Page count: 12
Related Papers
(42 in total)
  • [1] Boosting Accuracy of Differentially Private Continuous Data Release for Federated Learning
    Cai, Jianping
    Ye, Qingqing
    Hu, Haibo
    Liu, Ximeng
    Fu, Yanggeng
    IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2024, 19 : 10287 - 10301
  • [2] Differentially Private Federated Learning on Heterogeneous Data
    Noble, Maxence
    Bellet, Aurelien
    Dieuleveut, Aymeric
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [3] Local differentially private federated learning with homomorphic encryption
    Zhao, Jianzhe
    Huang, Chenxi
    Wang, Wenji
    Xie, Rulin
    Dong, Rongrong
    Matwin, Stan
    JOURNAL OF SUPERCOMPUTING, 2023, 79 (17) : 19365 - 19395
  • [4] Differentially Private Federated Learning with Local Regularization and Sparsification
    Cheng, Anda
    Wang, Peisong
    Zhang, Xi Sheryl
    Cheng, Jian
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 10112 - 10121
  • [5] PRIVATEFL: Accurate, Differentially Private Federated Learning via Personalized Data Transformation
    Yang, Yuchen
    Hui, Bo
    Yuan, Haolin
    Gong, Neil
    Cao, Yinzhi
    PROCEEDINGS OF THE 32ND USENIX SECURITY SYMPOSIUM, 2023, : 1595 - 1611
  • [6] Private Federated Submodel Learning via Private Set Union
    Wang, Zhusheng
    Ulukus, Sennur
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2024, 70 (04) : 2903 - 2921
  • [7] Distributionally Robust Federated Learning for Differentially Private Data
    Shi, Siping
    Hu, Chuang
    Wang, Dan
    Zhu, Yifei
    Han, Zhu
    2022 IEEE 42ND INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS (ICDCS 2022), 2022, : 842 - 852
  • [8] Boosting Accuracy of Differentially Private Federated Learning in Industrial IoT With Sparse Responses
    Cui, Laizhong
    Ma, Jiating
    Zhou, Yipeng
    Yu, Shui
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2023, 19 (01) : 910 - 920
  • [9] Differentially Private Federated Learning via Reconfigurable Intelligent Surface
    Yang, Yuhan
    Zhou, Yong
    Wu, Youlong
    Shi, Yuanming
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (20) : 19728 - 19743