Federated adaptive pruning with differential privacy

Cited: 0
Authors
Wang, Zhousheng [1 ]
Shen, Jiahe [2 ]
Dai, Hua [2 ,3 ]
Xu, Jian [2 ]
Yang, Geng [2 ]
Zhou, Hao [2 ]
Affiliations
[1] Nanjing Univ Posts & Telecommun, Sch Modern Posts, Nanjing 210003, Peoples R China
[2] Nanjing Univ Posts & Telecommun, Sch Comp Sci, Nanjing 210023, Peoples R China
[3] State Key Lab Tibetan Intelligent Informat Proc &, Nanjing 210023, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Federated learning; Lightweight machine learning; Data pruning; Differential privacy;
DOI
10.1016/j.future.2025.107783
Chinese Library Classification (CLC)
TP301 [Theory and Methods];
Discipline code
081202;
Abstract
Federated Learning (FL), as an emerging distributed machine learning technique, reduces the computational burden on the central server through decentralization while preserving data privacy. Each iteration typically involves client sampling and local training, followed by aggregation of the model on a central server. Although this distributed approach benefits privacy preservation, it also increases the computational load on local clients. Lightweight, efficient schemes are therefore indispensable for reducing communication and computational costs in FL. In addition, because uploaded models are exposed to model-stealing attacks, the level of privacy protection needs to be strengthened further. In this paper, we propose Federated Adaptive Pruning (FAP), a lightweight method that integrates FL with adaptive pruning by adjusting explicit regularization. Instead of modifying the model, we dynamically prune data from large local datasets during training to reduce computational costs and enhance privacy protection. In each round, the selected clients train on their local data and prune a portion of it before uploading the model for server-side aggregation; the remaining data are reserved for subsequent computations. In this way, the selected clients can quickly refine their data at the beginning of training. We further combine FAP with differential privacy to strengthen data privacy. Comprehensive experiments demonstrate the performance of FAP on different datasets with basic models such as CNNs and MLPs. The results show that our method significantly prunes the datasets and reduces computational overhead with minimal loss of accuracy. Compared with previous methods, FAP achieves the lowest training error and further improves client-side data privacy.
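The abstract describes FAP's per-round workflow (local training, pruning a portion of the local data, a differentially private model upload, and server-side aggregation) without implementation detail. The following is a minimal illustrative sketch of such a workflow in Python/NumPy under stated assumptions: the loss-based pruning rule, the clipping and noise parameters, and all names (`local_step`, `prune_client_data`, `dp_update`, `prune_fraction`, `noise_std`) are hypothetical choices for illustration, not the paper's actual FAP algorithm or its adaptive regularization scheme.

```python
# Illustrative sketch of a federated round with data pruning and a
# Gaussian-noise differentially private upload. This is NOT the authors'
# FAP implementation; the pruning heuristic and DP parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def local_step(w, X, y, lr=0.1):
    """One gradient step of logistic regression on a client's local data."""
    p = 1.0 / (1.0 + np.exp(-X @ w))          # predicted probabilities
    grad = X.T @ (p - y) / len(y)             # mean gradient
    return w - lr * grad

def per_sample_loss(w, X, y):
    """Cross-entropy loss of each local sample under the current model."""
    p = np.clip(1.0 / (1.0 + np.exp(-X @ w)), 1e-7, 1 - 1e-7)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

def prune_client_data(w, X, y, prune_fraction=0.2):
    """Drop the prune_fraction of samples with the lowest loss (assumed
    least informative); keep the rest for subsequent rounds."""
    keep = np.argsort(per_sample_loss(w, X, y))[int(prune_fraction * len(y)):]
    return X[keep], y[keep]

def dp_update(w_new, w_old, clip=1.0, noise_std=0.1):
    """Clip the model update and add Gaussian noise before upload."""
    delta = w_new - w_old
    delta = delta / max(1.0, np.linalg.norm(delta) / clip)
    return delta + rng.normal(0.0, noise_std * clip, size=delta.shape)

# Toy federation: 5 clients with synthetic 10-feature binary data.
clients = [(rng.normal(size=(200, 10)), rng.integers(0, 2, 200).astype(float))
           for _ in range(5)]
w_global = np.zeros(10)

for rnd in range(20):                                  # federated rounds
    updates = []
    for i, (X, y) in enumerate(clients):
        w_local = local_step(w_global, X, y)           # local training
        clients[i] = prune_client_data(w_local, X, y)  # prune local data
        updates.append(dp_update(w_local, w_global))   # clipped, noisy upload
    w_global = w_global + np.mean(updates, axis=0)     # server aggregation
```

The sketch keeps the global model unchanged in structure and shrinks each client's dataset round by round, mirroring the idea that selected clients refine their data early in training; in a real system the pruning fraction and noise scale would be set adaptively.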
Pages: 10
Related papers
50 records in total
  • [1] Bidirectional adaptive differential privacy federated learning scheme
    Li, Yang
    Xu, Jin
    Zhu, Jianming
    Wang, Youwei
    Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2024, 51 (03): : 158 - 169
  • [2] Differential Privacy Federated Learning Based on Adaptive Adjustment
    Cheng, Yanjin
    Li, Wenmin
    Qin, Sujuan
    Tu, Tengfei
    CMC-COMPUTERS MATERIALS & CONTINUA, 2025, 82 (03): : 4777 - 4795
  • [3] Dynamic Personalized Federated Learning with Adaptive Differential Privacy
    Yang, Xiyuan
    Huang, Wenke
    Ye, Mang
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [4] An adaptive federated learning scheme with differential privacy preserving
    Wu, Xiang
    Zhang, Yongting
    Shi, Minyu
    Li, Pei
    Li, Ruirui
    Xiong, Neal N.
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2022, 127 : 362 - 372
  • [5] Efficient adaptive defense scheme for differential privacy in federated learning
    Shan, Fangfang
    Lu, Yanlong
    Li, Shuaifeng
    Mao, Shiqi
    Li, Yuang
    Wang, Xin
    JOURNAL OF INFORMATION SECURITY AND APPLICATIONS, 2025, 89
  • [6] Adaptive differential privacy in vertical federated learning for mobility forecasting
    Errounda, Fatima Zahra
    Liu, Yan
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2023, 149 : 531 - 546
  • [7] Compressed Federated Learning Based on Adaptive Local Differential Privacy
    Miao, Yinbin
    Xie, Rongpeng
    Li, Xinghua
    Liu, Ximeng
    Ma, Zhuo
    Deng, Robert H.
    PROCEEDINGS OF THE 38TH ANNUAL COMPUTER SECURITY APPLICATIONS CONFERENCE, ACSAC 2022, 2022, : 159 - 170
  • [8] Adaptive Differential Privacy Algorithm for Federated Learning on Small Datasets
    Xia, Lei
    Yang, Huanbo
    2024 3RD INTERNATIONAL CONFERENCE ON ROBOTICS, ARTIFICIAL INTELLIGENCE AND INTELLIGENT CONTROL, RAIIC 2024, 2024, : 497 - 502
  • [9] AdaSTopk: Adaptive federated shuffle model based on differential privacy
    Yang, Qiantao
    Du, Xuehui
    Liu, Aodi
    Wang, Na
    Wang, Wenjuan
    Wu, Xiangyu
    INFORMATION SCIENCES, 2023, 642
  • [10] Multi-Stage Asynchronous Federated Learning With Adaptive Differential Privacy
    Li, Yanan
    Yang, Shusen
    Ren, Xuebin
    Shi, Liang
    Zhao, Cong
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, 46 (02) : 1243 - 1256