Differentially Private Federated Learning With Importance Client Sampling

Cited by: 0
Authors
Chen, Lin [1 ,2 ]
Ding, Xiaofeng [1 ,2 ]
Li, Mengqi [1 ,2 ]
Jin, Hai [1 ,2 ]
Affiliations
[1] Huazhong Univ Sci & Technol, Natl Engn Res Ctr Big Data Technol & Syst Lab, Serv Comp Technol & Syst Lab, Cluster & Grid Comp Lab, Wuhan 430074, Peoples R China
[2] Huazhong Univ Sci & Technol, Sch Comp Sci & Technol, Wuhan 430074, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Servers; Distributed databases; Consumer electronics; Privacy; Convergence; Data models; Federated learning; Differential privacy; Client sampling; Adaptive optimization
DOI
10.1109/TCE.2023.3338464
Chinese Library Classification
TM [Electrical engineering]; TN [Electronic and communication technology];
Subject Classification Code
0808; 0809;
Abstract
As numerous consumer electronics applications such as smartphones and wearables generate large volumes of distributed data daily, consumers want to handle this private, isolated data safely and efficiently. Federated learning (FL) is a promising way to meet this requirement owing to its strong data security and applicability to large-scale scenarios. However, diverse clients inevitably produce non-independent and identically distributed (non-iid) data, which severely complicates performance analysis. Moreover, under non-iid data the participating clients are typically heterogeneous, which gives rise to the client sampling problem. More importantly, although FL enhances privacy by keeping data local, highly sensitive data such as physiological signals from wearables demand stronger protection against third-party attacks. To jointly address data heterogeneity, client sampling, and privacy, we propose DPFLICS, a differentially private (DP) and importance-aware FL algorithm. Specifically, we use truncated concentrated DP to tightly track the end-to-end privacy loss. To attain better sampling, the server selects a subset of clients with probabilities derived from our importance client sampling scheme. To further improve performance, we also apply the adaptive YOGI optimizer on the server side, an adaptive gradient method that improves on the widely used Adam optimizer. Finally, extensive experiments demonstrate the effectiveness of our method.
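The abstract outlines three mechanisms behind DPFLICS: clipped and noised aggregation whose end-to-end privacy loss is tracked with truncated concentrated DP, client selection with importance-derived probabilities, and a server-side YOGI update. Below is a minimal Python sketch of one server round combining these pieces. All function names, the choice of importance score, and the hyperparameters are illustrative assumptions, not the authors' implementation, and the concentrated-DP accountant itself is omitted.

import numpy as np

def importance_sample_clients(scores, num_selected, rng):
    """Sample clients without replacement, with probability proportional to an
    importance score (assumed here, e.g., each client's recent update norm)."""
    probs = scores / scores.sum()
    return rng.choice(len(scores), size=num_selected, replace=False, p=probs)

def dp_aggregate(deltas, clip_norm, noise_mult, rng):
    """Clip each client update in L2 norm, average, and add Gaussian noise.
    The cumulative privacy loss over rounds would be tracked by a
    (truncated) concentrated-DP accountant, which is not shown."""
    clipped = [d * min(1.0, clip_norm / (np.linalg.norm(d) + 1e-12)) for d in deltas]
    avg = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, noise_mult * clip_norm / len(deltas), size=avg.shape)
    return avg + noise

def yogi_server_update(x, delta, state, lr=1e-2, beta1=0.9, beta2=0.99, eps=1e-3):
    """One YOGI step on the server, treating the aggregated delta as a
    pseudo-gradient; unlike Adam, the second moment is updated additively
    with a sign-controlled term."""
    m, v = state
    m = beta1 * m + (1 - beta1) * delta
    v = v - (1 - beta2) * np.sign(v - delta ** 2) * delta ** 2
    x = x + lr * m / (np.sqrt(v) + eps)
    return x, (m, v)

# Hypothetical single round: pick clients by importance, aggregate their
# clipped and noised updates, then apply the YOGI server step.
rng = np.random.default_rng(0)
dim, num_clients = 10, 100
global_model = np.zeros(dim)
yogi_state = (np.zeros(dim), np.full(dim, 1e-6))
scores = rng.uniform(0.1, 1.0, size=num_clients)          # assumed importance scores
chosen = importance_sample_clients(scores, num_selected=10, rng=rng)
deltas = [rng.normal(0, 0.1, size=dim) for _ in chosen]    # stand-in client updates
agg = dp_aggregate(deltas, clip_norm=1.0, noise_mult=1.0, rng=rng)
global_model, yogi_state = yogi_server_update(global_model, agg, yogi_state)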
Pages: 3635-3649
Page count: 15