Regulating Clients' Noise Adding in Federated Learning without Verification

Cited by: 0
Authors
Hong, Shu [1 ]
Duan, Lingjie [1 ]
Affiliations
[1] Singapore Univ Technol & Design, Singapore, Singapore
Keywords
DOI
10.1109/ICC45041.2023.10279141
CLC Classification
TN [Electronic technology, communication technology]
Discipline Code
0809
Abstract
In federated learning (FL), clients cooperatively train a global model by sharing gradients or parameters rather than raw data, yet local information can still be disclosed from the local updates transmitted to the parameter server. Given such privacy concerns, a client may add excessive artificial noise to its local updates and thereby compromise the global model training, and we prove that this selfish noise addition leads to an infinite price of anarchy (PoA). This paper proposes a novel pricing mechanism to regulate privacy-sensitive clients without verifying their parameter updates, unlike existing privacy mechanisms that assume the server has full knowledge of the added noise. Without knowing the ground truth, our mechanism reaches the social optimum that best balances the global training error and privacy loss, pricing each client according to the difference between its updated parameter and all clients' average parameter. We also improve the FL convergence bound by refining the aggregation rule at the server to account for clients' different noise variances. Moreover, we extend our pricing scheme to the case of incomplete information about clients' privacy sensitivities, ensuring truthful type reporting and ex-ante budget balance for the system. Simulations show that our pricing scheme greatly improves system performance, especially when clients have diverse privacy sensitivities.
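The record does not include the paper's actual mechanism, but the two ingredients the abstract names can be illustrated: clients perturb their local updates with Gaussian noise, and the server's aggregation rule accounts for clients' different noise variances. The sketch below is an assumption-laden illustration, not the paper's method: the function names are invented, and inverse-variance weighting is used as a stand-in for the paper's refined aggregation rule.

```python
import numpy as np

def noisy_update(local_params, noise_std, rng):
    """Client side: perturb the local parameter vector with Gaussian noise,
    illustrating the privacy-motivated 'noise adding' in the abstract."""
    return local_params + rng.normal(0.0, noise_std, size=local_params.shape)

def variance_aware_aggregate(updates, noise_vars):
    """Server side (assumed rule, for illustration only): weight each client's
    update by the inverse of its noise variance, so that noisier clients
    influence the aggregated global model less."""
    weights = 1.0 / np.asarray(noise_vars, dtype=float)
    weights /= weights.sum()
    return np.average(np.stack(updates), axis=0, weights=weights)

rng = np.random.default_rng(0)
true_param = np.ones(4)
noise_stds = [0.1, 0.1, 1.0]  # heterogeneous privacy sensitivities
updates = [noisy_update(true_param, s, rng) for s in noise_stds]
global_est = variance_aware_aggregate(updates, [s**2 for s in noise_stds])
```

With equal noise variances this weighting reduces to a plain average; as one client's variance grows, its contribution vanishes, which is the qualitative behavior a variance-aware aggregation rule targets.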
Pages: 1244-1249
Page count: 6
Related Papers
50 records in total
  • [1] Efficient Federated Learning with Self-Regulating Clients
    Talukder Z.
    Islam M.A.
    Performance Evaluation Review, 2023, 50 (04): 23 - 25
  • [2] Federated Learning of User Verification Models Without Sharing Embeddings
    Hosseini, Hossein
    Park, Hyunsin
    Yun, Sungrack
    Louizos, Christos
    Soriaga, Joseph
    Welling, Max
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [3] ON FEDERATED LEARNING WITH ENERGY HARVESTING CLIENTS
    Shen, Cong
    Yang, Jing
    Xu, Jie
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 8657 - 8661
  • [4] Learning Cautiously in Federated Learning with Noisy and Heterogeneous Clients
    Wu, Chenrui
    Li, Zexi
    Wang, Fangxin
    Wu, Chao
    2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023, : 660 - 665
  • [5] A quantum federated learning framework for classical clients
    Song, Yanqi
    Wu, Yusen
    Wu, Shengyao
    Li, Dandan
    Wen, Qiaoyan
    Qin, Sujuan
    Gao, Fei
    SCIENCE CHINA-PHYSICS MECHANICS & ASTRONOMY, 2024, 67 (05)
  • [6] Robust Federated Learning with Noisy and Heterogeneous Clients
    Fang, Xiuwen
    Ye, Mang
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 10062 - 10071
  • [8] DFHelper: Help clients to participate in federated learning tasks
    Zhenhao Wu
    Jianbo Gao
    Jiashuo Zhang
    Yue Li
    Qingshan Li
    Zhi Guan
    Zhong Chen
    Applied Intelligence, 2023, 53 : 12749 - 12773
  • [9] Provably Secure Federated Learning against Malicious Clients
    Cao, Xiaoyu
    Jia, Jinyuan
    Gong, Neil Zhenqiang
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 6885 - 6893
  • [10] Temporal Adaptive Clustering for Heterogeneous Clients in Federated Learning
    Ali, Syed Saqib
    Kumar, Ajit
    Ali, Mazhar
    Singh, Ankit Kumar
    Choi, Bong Jun
    38TH INTERNATIONAL CONFERENCE ON INFORMATION NETWORKING, ICOIN 2024, 2024, : 11 - 16