Covariance-Aware Private Mean Estimation Without Private Covariance Estimation

Cited by: 0
Authors
Brown, Gavin [1]
Gaboardi, Marco [1]
Smith, Adam [1]
Ullman, Jonathan [2]
Zakynthinou, Lydia [2]
Affiliations
[1] Boston Univ, Dept Comp Sci, Boston, MA 02215 USA
[2] Northeastern Univ, Khoury Coll Comp Sci, Boston, MA 02115 USA
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
We present two sample-efficient differentially private mean estimators for d-dimensional (sub)Gaussian distributions with unknown covariance. Informally, given n ≳ d/α² samples from such a distribution with mean μ and covariance Σ, our estimators output μ̂ such that ‖μ̂ − μ‖_Σ ≤ α, where ‖·‖_Σ denotes the Mahalanobis distance. All previous estimators with the same guarantee either require strong a priori bounds on the covariance matrix or require Ω(d^(3/2)) samples. Each of our estimators is based on a simple, general approach to designing differentially private mechanisms, but with novel technical steps to make the estimator private and sample-efficient. Our first estimator samples a point with approximately maximum Tukey depth using the exponential mechanism, but restricted to the set of points of large Tukey depth. Proving that this mechanism is private requires a novel analysis. Our second estimator perturbs the empirical mean of the data set with noise calibrated to the empirical covariance, without releasing the covariance itself. Its sample complexity guarantees hold more generally for subgaussian distributions, albeit with a slightly worse dependence on the privacy parameter. For both estimators, careful preprocessing of the data is required to satisfy differential privacy.
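For intuition, the minimal Python sketch below (not taken from the paper) illustrates the idea behind the second estimator described above: perturb the empirical mean with Gaussian noise whose shape matches the empirical covariance, so the error is controlled in Mahalanobis distance without the covariance itself being released. The function name and the noise_scale parameter are illustrative assumptions, and the careful preprocessing the paper requires is omitted, so this sketch on its own is not differentially private.

```python
# Illustrative sketch only (assumed helper, not the paper's algorithm): add
# Gaussian noise shaped like the empirical covariance to the empirical mean.
# Without the paper's private preprocessing, this is NOT differentially
# private: the empirical mean and covariance have unbounded sensitivity.
import numpy as np


def covariance_calibrated_mean(x: np.ndarray, noise_scale: float) -> np.ndarray:
    """x: (n, d) data matrix; noise_scale: hypothetical stand-in for the
    calibration a privacy analysis would dictate."""
    n, d = x.shape
    mu_hat = x.mean(axis=0)              # empirical mean, shape (d,)
    sigma_hat = np.cov(x, rowvar=False)  # empirical covariance, shape (d, d)
    # Noise ~ N(0, (noise_scale / n)^2 * sigma_hat): the error is shaped like
    # the data, so accuracy is naturally measured in the Mahalanobis norm
    # || . ||_Sigma, and sigma_hat itself is never output.
    noise = np.random.multivariate_normal(
        mean=np.zeros(d), cov=(noise_scale / n) ** 2 * sigma_hat
    )
    return mu_hat + noise


# Example with made-up numbers: a 3-d Gaussian with very different per-axis scales.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.multivariate_normal(
        mean=[1.0, -2.0, 0.5], cov=np.diag([1.0, 100.0, 0.01]), size=10_000
    )
    print(covariance_calibrated_mean(data, noise_scale=5.0))
```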
Pages: 15
Related papers
50 in total
  • [1] Differentially Private Covariance Estimation
    Amin, Kareem
    Dick, Travis
    Kulesza, Alex
    Medina, Andres Munoz
    Vassilvitskii, Sergei
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [2] DIFFERENTIALLY PRIVATE SPARSE INVERSE COVARIANCE ESTIMATION
    Wang, Di
    Huai, Mengdi
    Xu, Jinhui
    [J]. 2018 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP 2018), 2018, : 1139 - 1143
  • [3] Differentially private high dimensional sparse covariance matrix estimation
    Wang, Di
    Xu, Jinhui
    [J]. THEORETICAL COMPUTER SCIENCE, 2021, 865 : 119 - 130
  • [4] Agnostic Estimation of Mean and Covariance
    Lai, Kevin A.
    Rao, Anup B.
    Vempala, Santosh
    [J]. 2016 IEEE 57TH ANNUAL SYMPOSIUM ON FOUNDATIONS OF COMPUTER SCIENCE (FOCS), 2016, : 665 - 674
  • [5] Lower Bound of Locally Differentially Private Sparse Covariance Matrix Estimation
    Wang, Di
    Xu, Jinhui
    [J]. PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 4788 - 4794
  • [6] Mean and Covariance Estimation for Functional Snippets
    Lin, Zhenhua
    Wang, Jane-Ling
    [J]. JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2022, 117 (537) : 348 - 360
  • [7] Mean and covariance estimation of functional data streams
    Quan, Mingxue
    [J]. COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, 2024,
  • [8] Fast, Sample-Efficient, Affine-Invariant Private Mean and Covariance Estimation for Subgaussian Distributions (Extended Abstract)
    Brown, Gavin
    Hopkins, Samuel B.
    Smith, Adam
    [J]. THIRTY SIXTH ANNUAL CONFERENCE ON LEARNING THEORY, VOL 195, 2023, 195
  • [9] Differentially Private Covariance Revisited
    Dong, Wei
    Liang, Yuting
    Yi, Ke
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [10] Robust and differentially private mean estimation
    Liu, Xiyang
    Kong, Weihao
    Kakade, Sham
    Oh, Sewoong
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34