Locally Differentially-Private Distribution Estimation

Cited: 0
Authors
Pastore, Adriano [1]
Gastpar, Michael [1]
Affiliations
[1] Ecole Polytech Fed Lausanne, Inst Commun Syst, LINX Lab, CH-1015 Lausanne, Switzerland
Keywords
DOI
Not available
CLC Classification Number
TP301 [Theory and Methods];
Subject Classification Code
081202;
Abstract
We consider a setup in which confidential i.i.d. samples X_1, ..., X_n from an unknown discrete distribution P_X are passed through a discrete memoryless privatization channel (a.k.a. mechanism) which guarantees an ε-level of local differential privacy. For a given ε, the channel should be designed such that an estimate of the source distribution based on the channel outputs converges as fast as possible to the true distribution P_X. For this purpose we consider two metrics of estimation accuracy: the expected mean-square error and the expected Kullback-Leibler divergence. We derive their respective normalized first-order terms (as n → ∞), which for a given target privacy ε represent the factor by which the sample size must be augmented so as to achieve the same estimation accuracy as that of an identity (non-privatizing) channel. We formulate the privacy-utility tradeoff problem as that of minimizing said first-order term under a privacy constraint ε. A converse bound is stated which bounds the optimal tradeoff away from the origin. Inspired by recent work on the optimality of staircase mechanisms (albeit for objectives different from ours), we derive an achievable tradeoff based on circulant step mechanisms. Within this finite class, we determine the optimal step pattern.
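As a concrete illustration of the setup described above (a minimal sketch, not the optimized mechanism derived in the paper), the snippet below applies k-ary randomized response, a basic ε-locally differentially private channel, to i.i.d. samples and then inverts the channel on the empirical output frequencies to estimate P_X. The function names and the choice of mechanism are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def krr_privatize(x, k, eps, rng=None):
    """k-ary randomized response: keep the true symbol with probability
    e^eps / (e^eps + k - 1); otherwise output one of the other k - 1
    symbols uniformly at random. Satisfies eps-local differential privacy."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x)
    p_keep = np.exp(eps) / (np.exp(eps) + k - 1)
    keep = rng.random(x.shape[0]) < p_keep
    other = rng.integers(0, k - 1, size=x.shape[0])
    other = other + (other >= x)        # shift to skip the true symbol
    return np.where(keep, x, other)

def estimate_px(y, k, eps):
    """Invert the randomized-response channel on the empirical output
    distribution to obtain an unbiased estimate of P_X."""
    q_hat = np.bincount(y, minlength=k) / len(y)
    return (q_hat * (np.exp(eps) + k - 1) - 1) / (np.exp(eps) - 1)

# Toy run: the estimate approaches the true P_X, but the privatization
# noise inflates its variance relative to the identity (non-private) channel.
rng = np.random.default_rng(0)
p_x = np.array([0.5, 0.3, 0.2])
x = rng.choice(3, size=200_000, p=p_x)
y = krr_privatize(x, k=3, eps=1.0, rng=rng)
print(estimate_px(y, k=3, eps=1.0))
```

The gap between this estimate's variance and that of the plain empirical distribution over the raw samples is the kind of sample-size inflation factor that the first-order terms in the abstract quantify.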
Pages: 2694 - 2698
Number of pages: 5
Related papers
50 records in total
  • [1] Locally Differentially-Private Randomized Response for Discrete Distribution Learning
    Pastore, Adriano
    Gastpar, Michael
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2021, 22 : 1 - 56
  • [2] Distributionally-robust machine learning using locally differentially-private data
    Farokhi, Farhad
    [J]. OPTIMIZATION LETTERS, 2022, 16 (04) : 1167 - 1179
  • [3] Locally differentially private frequency distribution estimation with relative error optimization
    Wang, Ning
    Liu, Yifei
    Wang, Zhigang
    Wei, Zhiqiang
    Tang, Ruichun
    Tang, Peng
    Yu, Ge
    [J]. FRONTIERS OF COMPUTER SCIENCE, 2024, 18 (05)
  • [4] Towards Verifiable Differentially-Private Polling
    Garrido, Gonzalo Munilla
    Babel, Matthias
    Sedlmeir, Johannes
    [J]. PROCEEDINGS OF THE 17TH INTERNATIONAL CONFERENCE ON AVAILABILITY, RELIABILITY AND SECURITY, ARES 2022, 2022,
  • [5] Differentially-Private Network Trace Analysis
    McSherry, Frank
    Mahajan, Ratul
    [J]. ACM SIGCOMM COMPUTER COMMUNICATION REVIEW, 2010, 40 (04) : 123 - 134
  • [6] DIFFERENTIALLY-PRIVATE CANONICAL CORRELATION ANALYSIS
    Imtiaz, Hafiz
    Sarwate, Anand D.
    [J]. 2017 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP 2017), 2017, : 283 - 287
  • [7] Differentially-Private Clustering of Easy Instances
    Cohen, Edith
    Kaplan, Haim
    Mansour, Yishay
    Stemmer, Uri
    Tsfadia, Eliad
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [8] On the information leakage of differentially-private mechanisms
    Alvim, Mario S.
    Andres, Miguel E.
    Chatzikokolakis, Konstantinos
    Degano, Pierpaolo
    Palamidessi, Catuscia
    [J]. JOURNAL OF COMPUTER SECURITY, 2015, 23 (04) : 427 - 469