Weighted distributed differential privacy ERM: Convex and non-convex

Cited: 2
Authors
Kang, Yilin [1 ,4 ]
Liu, Yong [1 ,2 ,3 ]
Niu, Ben [1 ]
Wang, Weiping [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Informat Engn, 89-A Minzhuang Rd, Beijing 100093, Peoples R China
[2] Renmin Univ China, Gaoling Sch Artificial Intelligence, 59 Zhongguancun St, Beijing 100872, Peoples R China
[3] Beijing Key Lab Big Data Management & Anal Method, Beijing, Peoples R China
[4] Univ Chinese Acad Sci, Sch Cyber Secur, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Distributed machine learning; Differential privacy; Weighted parties; Empirical risk minimization; Strongly convex; Polyak-Łojasiewicz condition;
DOI
10.1016/j.cose.2021.102275
Chinese Library Classification (CLC)
TP [Automation and computer technology];
Discipline code
0812;
Abstract
Distributed machine learning allows different parties to learn a single model over all data sets without disclosing their own data. In this paper, we propose a weighted distributed differentially private (WD-DP) empirical risk minimization (ERM) method to train a model in the distributed setting, taking into account the different weights of different clients. For the first time, we theoretically analyze the benefits brought by the weighted paradigm in distributed differentially private machine learning. Our method advances the state-of-the-art differentially private ERM methods in the distributed setting. Through detailed theoretical analysis, we show that in the distributed setting, both the noise bound and the excess empirical risk bound can be improved by considering the different weights held by multiple parties. Additionally, since strong convexity of the loss function is not always easy to guarantee in ERM, we generalize our method to loss functions that are not necessarily strongly convex but satisfy the Polyak-Łojasiewicz condition. Experiments on real data sets show that our method is more reliable and improves the performance of distributed differentially private ERM, especially when the data scales on different clients are uneven. Moreover, our distributed method achieves almost the same theoretical and experimental results as previous centralized methods. (c) 2021 Elsevier Ltd. All rights reserved.
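The weighted aggregation idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' WD-DP algorithm; it is a hypothetical one-round scheme, assuming output perturbation via clipped gradients with Gaussian noise, where the server weights each party's contribution by its local data size. All function names and parameters (`local_gradient`, `weighted_dp_step`, `lr`, `sigma`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_gradient(w, X, y):
    """Gradient of L2-regularized logistic loss on one party's local data.

    Labels y are in {-1, +1}; p is the model's probability of the true label."""
    z = X @ w
    p = 1.0 / (1.0 + np.exp(-y * z))
    return X.T @ (-y * (1.0 - p)) / len(y) + 0.01 * w

def weighted_dp_step(w, parties, lr=0.5, sigma=0.1):
    """One round: each party sends a clipped, noised gradient; the server
    averages the reports with weights proportional to local data size,
    so a party holding more samples contributes more to the update."""
    n_total = sum(len(y) for _, y in parties)
    agg = np.zeros_like(w)
    for X, y in parties:
        g = local_gradient(w, X, y)
        g /= max(1.0, np.linalg.norm(g))           # clip to L2 norm 1
        g = g + rng.normal(0.0, sigma, size=g.shape)  # Gaussian DP noise
        agg += (len(y) / n_total) * g              # weight by data size
    return w - lr * agg
```

Weighting by data size is one natural choice for the uneven-data regime the abstract highlights: the aggregated noise variance shrinks for large parties while small parties still perturb their own reports, which mirrors (informally) why weighted aggregation can tighten the noise bound.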
Pages: 18