Private Federated Submodel Learning via Private Set Union

Cited: 0
Authors
Wang, Zhusheng [1 ]
Ulukus, Sennur [1 ]
Affiliations
[1] Univ Maryland, Dept Elect & Comp Engn, College Pk, MD 20742 USA
Keywords
Servers; Databases; Costs; Indexes; Training; Data privacy; Training data; Federated submodel learning; private set union; symmetric private information retrieval; INFORMATION-RETRIEVAL; CAPACITY; STORAGE;
DOI
10.1109/TIT.2023.3344717
CLC Number
TP [automation technology; computer technology]
Discipline Code
0812
Abstract
We consider the federated submodel learning (FSL) problem and propose an approach where clients are able to update the central model information-theoretically privately. Our approach is based on private set union (PSU), which is in turn based on multi-message symmetric private information retrieval (MM-SPIR). The server has two non-colluding databases which keep the model in a replicated manner. With our scheme, the server does not learn anything beyond the subset of submodels updated by the clients: it does not learn which client updated which submodel(s), or anything about the local client data. In comparison to the state-of-the-art private FSL schemes of Jia-Jafar and Vithana-Ulukus, our scheme does not require noisy storage of the model at the databases; and in comparison to the secure aggregation scheme of Zhao-Sun, our scheme does not require pre-distribution of client-side common randomness; instead, our scheme creates the required client-side common randomness via random symmetric private information retrieval (RSPIR) and one-time pads. Our system is initialized with a replicated storage of submodels and a sufficient amount of common randomness at the two databases on the server side. The protocol starts with a common randomness generation (CRG) phase in which the two databases establish common randomness at the client side using RSPIR and one-time pads (this phase is called FSL-CRG). Next, the clients utilize the established client-side common randomness to have the server privately determine the union of indices of submodels to be updated collectively by the clients (this phase is called FSL-PSU). Then, the two databases broadcast the current versions of the submodels in the set union to the clients. The clients update these submodels based on their local training data. Finally, the clients use a variation of FSL-PSU to write the updates back to the databases privately (this phase is called FSL-write).
Since the two databases at the server do not communicate with each other, as a novel approach we utilize carefully chosen alive clients to route the required information between them. Our proposed private FSL scheme is robust against client drop-outs, client late-arrivals, and database drop-outs.
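To make the FSL-PSU idea concrete, here is a minimal toy sketch (not the paper's two-database protocol) of how one-time-pad masking lets a server recover only the union of clients' submodel-index sets. It assumes clients already hold pairwise common randomness; the field modulus `Q` and all function names are illustrative assumptions, and the random nonzero coefficients hide per-index update counts with overwhelming probability rather than perfectly:

```python
import random

Q = 2**61 - 1  # a large prime modulus; an assumption for this toy sketch

def make_pads(num_clients, K, rng):
    """Pairwise one-time pads that cancel in the aggregate:
    pads[i][j] + pads[j][i] = 0 (mod Q) coordinate-wise."""
    pads = [[[0] * K for _ in range(num_clients)] for _ in range(num_clients)]
    for i in range(num_clients):
        for j in range(i + 1, num_clients):
            r = [rng.randrange(Q) for _ in range(K)]
            pads[i][j] = r
            pads[j][i] = [(-x) % Q for x in r]
    return pads

def client_message(i, index_set, pads, K, rng):
    """Mask a randomized incidence vector of client i's submodel indices."""
    # random nonzero coefficients hide how many clients picked each index
    v = [rng.randrange(1, Q) if k in index_set else 0 for k in range(K)]
    msg = v
    for j in range(len(pads)):
        if j != i:
            msg = [(m + p) % Q for m, p in zip(msg, pads[i][j])]
    return msg

def server_union(messages, K):
    """Sum the masked vectors: the pads cancel, and with high probability
    a coordinate is nonzero iff some client chose that submodel index."""
    agg = [0] * K
    for m in messages:
        agg = [(a + x) % Q for a, x in zip(agg, m)]
    return {k for k in range(K) if agg[k] != 0}

# usage: three clients, six submodels
rng = random.Random(0)
sets = [{0, 2}, {2, 3}, {5}]
pads = make_pads(3, 6, rng)
msgs = [client_message(i, s, pads, 6, rng) for i, s in enumerate(sets)]
print(server_union(msgs, 6))  # the union {0, 2, 3, 5}, not who chose what
```

Each individual message is uniformly random (a one-time pad), so the server learns nothing from any proper subset of messages; only the full aggregate reveals the union. The paper's scheme additionally generates this client-side common randomness via RSPIR rather than assuming it is pre-distributed.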
Pages: 2903-2921
Page count: 19
Related Papers
50 in total
  • [1] Private Federated Submodel Learning with Sparsification
    Vithana, Sajani
    Ulukus, Sennur
    [J]. 2022 IEEE INFORMATION THEORY WORKSHOP (ITW), 2022, : 410 - 415
  • [2] Efficient Private Federated Submodel Learning
    Vithana, Sajani
    Ulukus, Sennur
    [J]. IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 3394 - 3399
  • [3] X-Secure T-Private Federated Submodel Learning
    Jia, Zhuqing
    Jafar, Syed A.
    [J]. IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2021), 2021,
  • [4] Rate Distortion Tradeoff in Private Read Update Write in Federated Submodel Learning
    Vithana, Sajani
    Ulukus, Sennur
    [J]. 2022 56TH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS, AND COMPUTERS, 2022, : 210 - 214
  • [5] Information-Theoretically Private Federated Submodel Learning With Storage Constrained Databases
    Vithana, Sajani
    Ulukus, Sennur
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 2024, 70 (08) : 6041 - 6059
  • [6] X-Secure T-Private Federated Submodel Learning With Elastic Dropout Resilience
    Jia, Zhuqing
    Jafar, Syed Ali
    [J]. IEEE TRANSACTIONS ON INFORMATION THEORY, 2022, 68 (08) : 5418 - 5439
  • [8] Differentially Private Set Union
    Gopi, Sivakanth
    Gulhane, Pankaj
    Kulkarni, Janardhan
    Shen, Judy Hanwen
    Shokouhi, Milad
    Yekhanin, Sergey
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [9] Multi-Party Private Set Intersection in Vertical Federated Learning
    Lu, Linpeng
    Ding, Ning
    [J]. 2020 IEEE 19TH INTERNATIONAL CONFERENCE ON TRUST, SECURITY AND PRIVACY IN COMPUTING AND COMMUNICATIONS (TRUSTCOM 2020), 2020, : 707 - 714
  • [10] Differentially Private Federated Learning via Reconfigurable Intelligent Surface
    Yang, Yuhan
    Zhou, Yong
    Wu, Youlong
    Shi, Yuanming
    [J]. IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (20) : 19728 - 19743