Federated adaptive pruning with differential privacy

Cited: 0
Authors
Wang, Zhousheng [1 ]
Shen, Jiahe [2 ]
Dai, Hua [2 ,3 ]
Xu, Jian [2 ]
Yang, Geng [2 ]
Zhou, Hao [2 ]
Affiliations
[1] Nanjing Univ Posts & Telecommun, Sch Modern Posts, Nanjing 210003, Peoples R China
[2] Nanjing Univ Posts & Telecommun, Sch Comp Sci, Nanjing 210023, Peoples R China
[3] State Key Lab Tibetan Intelligent Informat Proc &, Nanjing 210023, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Federated learning; Lightweight machine learning; Data pruning; Differential privacy;
DOI
10.1016/j.future.2025.107783
CLC number
TP301 [Theory, methods];
Discipline code
081202 ;
Abstract
Federated Learning (FL), as an emerging distributed machine learning technique, reduces the computational burden on the central server through decentralization while preserving data privacy. Each iteration typically involves client sampling and local training, followed by model aggregation on a central server. Although this distributed approach benefits privacy preservation, it also increases the computational load on local clients, so lightweight, efficient schemes are indispensable for reducing communication and computational costs in FL. Moreover, because uploaded models are vulnerable to model-stealing attacks, the level of privacy protection must be strengthened further. In this paper, we propose Federated Adaptive Pruning (FAP), a lightweight method that integrates FL with adaptive pruning by adjusting explicit regularization. We keep the model unchanged and instead dynamically prune the data from large datasets during training to reduce computational costs and enhance privacy protection. In each training round, selected clients train on their local data and prune a portion of it before uploading the model for server-side aggregation; the remaining data are reserved for subsequent computations. With this approach, selected clients can quickly refine their data at the beginning of training. In addition, we combine FAP with differential privacy to further strengthen data privacy. Comprehensive experiments demonstrate the performance of FAP on different datasets with basic models such as CNNs and MLPs. The results show that our method significantly prunes the datasets, reducing computational overhead with minimal loss of accuracy. Compared with previous methods, it achieves the lowest training error and further improves client-side data privacy.
Pages: 10