An efficient and robust privacy-preserving framework for cross-device federated learning

Cited: 4
Authors
Du, Weidong [1 ,2 ]
Li, Min [1 ]
Wu, Liqiang [2 ]
Han, Yiliang [2 ]
Zhou, Tanping [2 ]
Yang, Xiaoyuan [2 ]
Institutions
[1] Xian Hitech Res Inst, Xian 710025, Peoples R China
[2] Engn Univ PAP, Coll Cryptog, Xian 710086, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Cross-device federated learning; Privacy-preserving; Robust; Efficient; Collusion-attack resistant; Post-quantum security;
DOI
10.1007/s40747-023-00978-9
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
To ensure that no private information is leaked during the aggregation phase of federated learning (FL), many frameworks use homomorphic encryption (HE) to mask local model updates. However, the heavy overheads of these frameworks make them unsuitable for cross-device FL, where the clients are a large number of mobile and edge devices with limited computing resources. Even worse, some of them also fail to handle the dynamic participation of clients. To overcome these shortcomings, we propose a threshold multi-key HE scheme, tMK-CKKS, and design an efficient and robust privacy-preserving FL framework. Robustness means that our framework allows clients to join or drop out during the training process. Moreover, because tMK-CKKS can pack multiple messages into a single ciphertext, our framework significantly reduces both computation and communication overhead. In addition, the threshold mechanism in tMK-CKKS ensures that our framework resists collusion attacks between the server and no more than t (the threshold value) curious internal clients. Finally, we implement our framework in FedML and conduct extensive experiments to evaluate it. Utility evaluations on 6 benchmark datasets show that our framework protects privacy without sacrificing model accuracy. Efficiency evaluations on 4 typical deep learning models demonstrate that our framework speeds up computation by at least 1.21x over the xMK-CKKS-based framework, 15.84x over the BatchCrypt-based framework, and 20.30x over the CRT-Paillier-based framework, and reduces the communication burden by at least 8.61 MB over the BatchCrypt-based framework, 35.36 MB over the xMK-CKKS-based framework, and 42.58 MB over the CRT-Paillier-based framework. The advantages in both computation and communication grow with the size of the deep learning model.
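The efficiency claim above rests on CKKS-style ciphertext packing: thousands of model parameters are encoded into one ciphertext, and the server adds ciphertexts without ever decrypting individual updates. The following minimal sketch illustrates only that packing-and-aggregation idea using the open-source TenSEAL library with a plain single-key CKKS context; it is not the paper's tMK-CKKS scheme, and the client/parameter counts (`num_clients`, `model_size`) are illustrative assumptions.

```python
# Minimal sketch (assumption: TenSEAL installed; single-key CKKS stands in for
# the paper's threshold multi-key tMK-CKKS). Shows packed encryption of local
# updates and server-side additive aggregation over ciphertexts only.
import numpy as np
import tenseal as ts

# CKKS context: one ciphertext packs up to poly_modulus_degree / 2 real values.
context = ts.context(
    ts.SCHEME_TYPE.CKKS,
    poly_modulus_degree=8192,
    coeff_mod_bit_sizes=[60, 40, 40, 60],
)
context.global_scale = 2 ** 40

num_clients = 3        # toy number of participating devices
model_size = 4096      # toy flattened model update, fits one ciphertext's slots

# Each client encrypts its whole local update in one packed ciphertext.
local_updates = [np.random.randn(model_size) for _ in range(num_clients)]
encrypted_updates = [ts.ckks_vector(context, u.tolist()) for u in local_updates]

# The server aggregates ciphertexts homomorphically, never seeing plaintexts.
encrypted_sum = encrypted_updates[0]
for ct in encrypted_updates[1:]:
    encrypted_sum = encrypted_sum + ct

# Decryption of the averaged update. In the paper's framework this step would
# instead require partial decryptions from more than t clients (threshold).
avg = np.array(encrypted_sum.decrypt()) / num_clients
print("max aggregation error:", np.abs(avg - np.mean(local_updates, axis=0)).max())
```

Packing is what amortizes the per-ciphertext cost of HE across thousands of parameters, which is why the reported computation and communication savings grow with model size; the collusion resistance comes from the threshold decryption step, which this single-key sketch cannot show.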
Pages: 4923 - 4937
Page count: 15
Related Papers
50 records in total
  • [1] An efficient and robust privacy-preserving framework for cross-device federated learning
    Weidong Du
    Min Li
    Liqiang Wu
    Yiliang Han
    Tanping Zhou
    Xiaoyuan Yang
    [J]. Complex & Intelligent Systems, 2023, 9 : 4923 - 4937
  • [2] FLZip: An Efficient and Privacy-Preserving Framework for Cross-Silo Federated Learning
    Feng, Xiaojie
    Du, Haizhou
    [J]. IEEE CONGRESS ON CYBERMATICS / 2021 IEEE INTERNATIONAL CONFERENCES ON INTERNET OF THINGS (ITHINGS) / IEEE GREEN COMPUTING AND COMMUNICATIONS (GREENCOM) / IEEE CYBER, PHYSICAL AND SOCIAL COMPUTING (CPSCOM) / IEEE SMART DATA (SMARTDATA), 2021, : 209 - 216
  • [3] Cross-device privacy-preserving personal digital information management framework
    Al Saffar, Meshaal
    Knottenbelt, William
    [J]. 2019 FIRST INTERNATIONAL CONFERENCE ON DIGITAL DATA PROCESSING (DDP), 2019, : 41 - 50
  • [4] Robust privacy-preserving federated learning framework for IoT devices
    Han, Zhaoyang
    Zhou, Lu
    Ge, Chunpeng
    Li, Juan
    Liu, Zhe
    [J]. INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2022, 37 (11) : 9655 - 9673
  • [5] Efficient and Privacy-Preserving Byzantine-robust Federated Learning
    Luan, Shijie
    Lu, Xiang
    Zhang, Zhuangzhuang
    Chang, Guangsheng
    Guo, Yunchuan
    [J]. IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 2202 - 2208
  • [6] PEPFL: A framework for a practical and efficient privacy-preserving federated learning
    Yange Chen
    Baocang Wang
    Hang Jiang
    Pu Duan
    Yuan Ping
    Zhiyong Hong
    [J]. Digital Communications and Networks, 2024, 10 (02) - 368
  • [7] BSR-FL: An Efficient Byzantine-Robust Privacy-Preserving Federated Learning Framework
    Zeng, Honghong
    Li, Jie
    Lou, Jiong
    Yuan, Shijing
    Wu, Chentao
    Zhao, Wei
    Wu, Sijin
    Wang, Zhiwen
    [J]. IEEE TRANSACTIONS ON COMPUTERS, 2024, 73 (08) : 2096 - 2110
  • [8] Word2Vec-based efficient privacy-preserving shared representation learning for federated recommendation system in a cross-device setting
    Lee, Taek-Ho
    Kim, Suhyeon
    Lee, Junghye
    Jun, Chi-Hyuck
    [J]. INFORMATION SCIENCES, 2023, 651
  • [9] An Efficient Federated Learning Framework for Privacy-Preserving Data Aggregation in IoT
    Shi, Rongquan
    Wei, Lifei
    Zhang, Lei
    [J]. 2023 20TH ANNUAL INTERNATIONAL CONFERENCE ON PRIVACY, SECURITY AND TRUST, PST, 2023, : 385 - 391
  • [10] FLCP: federated learning framework with communication-efficient and privacy-preserving
    Yang, Wei
    Yang, Yuan
    Xi, Yingjie
    Zhang, Hailong
    Xiang, Wei
    [J]. APPLIED INTELLIGENCE, 2024, 54 (9-10) : 6816 - 6835