SPM-FL: A Federated Learning Privacy-Protection Mechanism Based on Local Differential Privacy

Cited: 1
Authors
Chen, Zhiyan [1 ]
Zheng, Hong [1 ]
Affiliations
[1] Changchun Univ Technol, Sch Comp Sci & Engn, Changchun 130012, Peoples R China
Keywords
federated learning; local differential privacy; privacy protection; deep learning;
DOI
10.3390/electronics13204091
Chinese Library Classification (CLC): TP [Automation Technology, Computer Technology]
Discipline Code: 0812
Abstract
Federated learning is a widely used distributed machine learning method that protects client privacy by sharing and aggregating model parameters on the server side, thereby avoiding the transfer of raw data to third parties. However, information such as model weights can still be analyzed or attacked, leading to potential privacy breaches. Traditional federated learning methods often perturb models by adding Gaussian or Laplacian noise, but under small privacy budgets the large noise variance degrades model accuracy. To address this issue, this paper proposes a Symmetric Partition Mechanism (SPM), which probabilistically perturbs the sign of local model weight parameters before model aggregation. The mechanism satisfies strict ε-differential privacy and introduces a variance constraint that effectively reduces the impact of noise on model performance. Compared with traditional methods, SPM yields a smaller variance under the same privacy budget, thereby improving model accuracy, and it remains applicable across varying numbers of clients. Theoretical analysis and experiments on multiple datasets demonstrate the effectiveness and privacy-protection capability of the proposed mechanism.
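To illustrate the general idea of probabilistic sign perturbation described in the abstract, the following is a minimal sketch of a randomized-response-style sign flip applied to a client's weight vector before upload. It is not the paper's SPM (the SPM's symmetric partitioning and variance constraint are not reproduced here); the function name `sign_perturb` and the unbiased rescaling are illustrative assumptions only.

```python
import numpy as np

def sign_perturb(weights, epsilon, rng=None):
    """Illustrative sign perturbation via randomized response (not the paper's SPM).

    Each weight keeps its sign with probability p = e^eps / (1 + e^eps)
    and has it flipped otherwise, which satisfies eps-LDP for the sign bit.
    Dividing by (2p - 1) makes the output an unbiased estimate of the input,
    since E[output] = (p - (1 - p)) * w = (2p - 1) * w.
    """
    rng = np.random.default_rng() if rng is None else rng
    w = np.asarray(weights, dtype=float)
    p = np.exp(epsilon) / (1.0 + np.exp(epsilon))   # keep-sign probability
    keep = rng.random(w.shape) < p                  # Bernoulli(p) draw per weight
    signs = np.where(keep, 1.0, -1.0)               # +1 keeps the sign, -1 flips it
    return signs * w / (2.0 * p - 1.0)              # debias before server aggregation

# Example: a client perturbs its flattened local weights before sending them
local_w = np.array([0.12, -0.55, 0.03])
noisy_w = sign_perturb(local_w, epsilon=1.0)
```

Under this kind of mechanism the server averages the debiased uploads across clients, so the per-weight perturbation variance shrinks as the number of participating clients grows.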
Pages: 39
Related Papers
50 records
  • [41] Personalized Federated Learning With Differential Privacy
    Hu, Rui
    Guo, Yuanxiong
    Li, Hongning
    Pei, Qingqi
    Gong, Yanmin
    IEEE INTERNET OF THINGS JOURNAL, 2020, 7 (10) : 9530 - 9539
  • [42] Federated Learning with Bayesian Differential Privacy
    Triastcyn, Aleksei
    Faltings, Boi
    2019 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2019, : 2587 - 2596
  • [43] Balancing Privacy and Performance: A Differential Privacy Approach in Federated Learning
    Tayyeh, Huda Kadhim
    AL-Jumaili, Ahmed Sabah Ahmed
    COMPUTERS, 2024, 13 (11)
  • [44] Local Distribution Privacy in Federated Learning
    Stelldinger, Peer
    Ibrahim, Mustafa F. R.
    INTELLIGENT DISTRIBUTED COMPUTING XVI, IDC 2023, 2024, 1138 : 9 - 12
  • [45] Fed-NILM: A federated learning-based non-intrusive load monitoring method for privacy-protection
    Wang, Haijin
    Si, Caomingzhe
    Liu, Guolong
    Zhao, Junhua
    Wen, Fushuan
    Xue, Yusheng
    Energy Conversion and Economics, 2022, 3 (02): : 51 - 60
  • [46] Local Differential Privacy Based Membership-Privacy-Preserving Federated Learning for Deep-Learning-Driven Remote Sensing
    Zhang, Zheng
    Ma, Xindi
    Ma, Jianfeng
    REMOTE SENSING, 2023, 15 (20)
  • [47] Federated learning data protection scheme based on personalized differential privacy in psychological evaluation
    Liu, Xueyan
    Wang, Jia
    Xiong, Xin
    Sun, Hao
    NEUROCOMPUTING, 2025, 611
  • [48] Acceleration offloading for differential privacy protection based on federated learning in edge intelligent controllers
    Shang, Wenli
    Ge, Jingbo
    Ding, Lei
    Jiang, Zhenbang
    Sui, He
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2025, 163
  • [49] Federated Learning Model with Adaptive Differential Privacy Protection in Medical IoT
    Ni, Lina
    Huang, Peng
    Wei, Yongshan
    Shu, Minglei
    Zhang, Jinquan
    WIRELESS COMMUNICATIONS & MOBILE COMPUTING, 2021, 2021
  • [50] Trajectory Privacy-Protection Mechanism Based on Multidimensional Spatial-Temporal Prediction
    Xi, Ji
    Shi, Meiyu
    Zhang, Weiqi
    Xu, Zhe
    Liu, Yanting
    SYMMETRY-BASEL, 2024, 16 (09):