Communication-Efficient and Privacy-Preserving Aggregation in Federated Learning With Adaptability

Times Cited: 0
Authors
Sun, Xuehua [1 ,2 ,3 ]
Yuan, Zengsen [4 ]
Kong, Xianguang [1 ]
Xue, Liang [5 ,6 ]
He, Lang [5 ]
Lin, Ying [7 ]
Affiliations
[1] Xidian Univ, Coll Mech & Elect Engn, Xian 710071, Peoples R China
[2] Xian Univ Posts & Telecommun, Sch Comp Sci & Technol, Shaanxi Key Lab Network Data Anal & Intelligent Pr, Xian 710121, Peoples R China
[3] Xian Univ Posts & Telecommun, Xian Key Lab Big Data & Intelligent Comp, Xian 710121, Peoples R China
[4] Xian Univ Posts & Telecommun, Sch Comp Sci & Technol, Xian 710121, Peoples R China
[5] Univ Guelph, Sch Comp Sci, Guelph, ON N1G 2W1, Canada
[6] Xian Univ Posts & Telecommun, Sch Comp Sci, Xian 710121, Peoples R China
[7] Shaanxi Hanlin Holdings Grp Co Ltd, Xian 710000, Peoples R China
Source
IEEE INTERNET OF THINGS JOURNAL | 2024, Vol. 11, No. 15
Funding
National Natural Science Foundation of China
Keywords
Adaptive gradient clipping; communication efficiency; differential privacy (DP); federated learning (FL); strategy
DOI
10.1109/JIOT.2024.3396217
CLC Classification
TP [Automation Technology, Computer Technology]
Subject Classification
0812
Abstract
Federated learning (FL) aims to protect data privacy while aggregating models. Existing works rarely address the three main challenges of FL simultaneously: communication efficiency, privacy, and utility. In particular, sensitive information about the training data can still be inferred from the model parameters shared in FL. In recent years, differential privacy (DP) has been applied in FL to protect data privacy. The challenge of implementing DP in FL lies in the detrimental impact of DP noise on model accuracy: the noise slows model convergence, which in turn incurs additional communication overhead. Given the inherently high communication costs of FL, the FL process can thus become inefficient or even infeasible. In view of this, we propose a novel differentially private FL (DPFL) scheme named Adap-FedITK, which aims to achieve low communication overhead and high model accuracy while guaranteeing client-level DP. Specifically, we dynamically adjust the gradient clipping threshold for each client in every round based on the heterogeneity of gradients, mitigating the negative impact of DP noise and achieving a privacy-utility tradeoff. To alleviate the high communication overhead of FL, we introduce an improved top-k algorithm that uses sparsification and quantization to compress the model and eliminate communication redundancy, and further integrates coding techniques to reduce communication. Extensive experimental results demonstrate that our method achieves a privacy-utility tradeoff and improves communication efficiency while ensuring client-level DPFL.
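The two mechanisms the abstract describes, per-round adaptive gradient clipping with Gaussian DP noise and top-k sparsification with quantization, can be sketched as follows. This is a minimal illustration, not the paper's actual Adap-FedITK algorithm: the function names, the quantile-based threshold rule, and the uniform 8-bit quantizer are all assumptions for demonstration.

```python
import numpy as np

def adaptive_clip_norm(grad_norms, quantile=0.5):
    """Pick a per-round clipping threshold from observed client gradient
    norms (one plausible way to adapt to gradient heterogeneity)."""
    return float(np.quantile(grad_norms, quantile))

def clip_and_noise(grad, clip_norm, noise_multiplier, rng):
    """Clip a client update to clip_norm in L2, then add Gaussian noise
    scaled to the clipping threshold (the standard Gaussian DP mechanism)."""
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / (norm + 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=grad.shape)
    return clipped + noise

def top_k_sparsify(vec, k):
    """Keep only the k largest-magnitude entries; transmit (indices, values)."""
    idx = np.argsort(np.abs(vec))[-k:]
    return idx, vec[idx]

def quantize(values, bits=8):
    """Uniform scalar quantization of the retained values to `bits` bits."""
    scale = np.max(np.abs(values)) + 1e-12
    levels = 2 ** (bits - 1) - 1
    q = np.round(values / scale * levels).astype(np.int8)
    return q, scale
```

A client would clip and perturb its update, then send only the quantized top-k entries (plus indices and the scale), which the server dequantizes and scatters back into a dense vector before averaging; the coding step the abstract mentions would further compress the index stream.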
Pages: 26430-26443
Page count: 14