Unexpected Information Leakage of Differential Privacy Due to the Linear Property of Queries

Cited: 5
Authors
Huang, Wen [1 ]
Zhou, Shijie [1 ]
Liao, Yongjian [1 ]
Institutions
[1] Univ Elect Sci & Technol China, Sch Informat & Software Engn, Chengdu 610054, Peoples R China
Keywords
Privacy; Differential privacy; Sensitivity; Correlation; Testing; National Institutes of Health; Switches; Laplace mechanism; membership inference attacks; differential privacy; linear property;
DOI
10.1109/TIFS.2021.3075843
CLC number
TP301 [Theory, Methods]
Subject classification number
081202
Abstract
Differential privacy is a widely accepted concept of privacy preservation, and the Laplace mechanism is a well-known instance of differentially private mechanisms for numerical data. In this paper, we find that differential privacy does not take the linear property of queries into account, resulting in unexpected information leakage. Specifically, the linear property makes it possible to divide one query into two queries: q(D) = q(D_1) + q(D_2) whenever D = D_1 ∪ D_2 and D_1 ∩ D_2 = ∅. Attackers who want an answer to q(D) can therefore not only issue the query q(D) directly but can also issue q(D_1) and compute q(D_2) by themselves, as long as they know D_2. Through different divisions of one query, attackers can obtain multiple different answers to the same query from differentially private mechanisms. However, if the divisions are delicately designed, the total privacy budget consumed differs between the attackers' perspective and the mechanism's perspective. This difference leads to unexpected information leakage, because the privacy budget is the key parameter controlling the amount of information legally released by differentially private mechanisms. To demonstrate this unexpected information leakage, we present a membership inference attack against the Laplace mechanism. Specifically, under the constraints of differential privacy, we propose a method for obtaining multiple independent and identically distributed (i.i.d.) samples of answers to queries that satisfy the linear property. The proposed method is based on the linear property and some background knowledge held by the attackers. When the background knowledge is sufficient, the method can obtain enough samples from differentially private mechanisms that the total consumed privacy budget becomes unreasonably large. Based on the obtained samples, a hypothesis testing method is used to determine whether a target record is in the target dataset.
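The leakage described in the abstract can be illustrated with a small simulation. The following Python sketch is not the paper's exact attack; the dataset, query, parameters, and knowledge assumptions are all illustrative. It shows how the linearity of a sum query lets an attacker who knows every record except the target issue many syntactically different queries q(D \ {d_i}) against a Laplace mechanism and reconstruct i.i.d. noisy samples of the same answer q(D):

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_mechanism(true_answer, sensitivity, epsilon):
    """Standard Laplace mechanism: add Laplace(sensitivity / epsilon) noise."""
    return true_answer + rng.laplace(0.0, sensitivity / epsilon)

# Toy dataset: each record is a value in [0, 1]; the query is the sum,
# so the sum query has sensitivity 1.
D = rng.random(100)
epsilon, sensitivity = 0.5, 1.0

# Linearity: if D = D1 ∪ D2 with D1 ∩ D2 = ∅, then sum(D) = sum(D1) + sum(D2).
# Assume (illustratively) the attacker knows every record except the target
# at index 0. For each known record d_i, the query sum(D \ {d_i}) looks like
# a different query to the mechanism, yet adding the known d_i back yields a
# fresh i.i.d. noisy sample of sum(D).
samples = []
for i in range(1, len(D)):                 # records the attacker knows
    D1 = np.delete(D, i)                   # D1 = D \ {d_i}
    noisy_partial = laplace_mechanism(D1.sum(), sensitivity, epsilon)
    samples.append(noisy_partial + D[i])   # q(D2) = d_i, computed locally

# Each sample is sum(D) + Laplace(scale=2) noise (std ≈ 2.83). Averaging the
# 99 samples shrinks the noise std to about 0.28 -- far more accurate than a
# single epsilon = 0.5 answer permits, so the effective privacy loss has
# silently grown beyond the nominal budget.
estimate = float(np.mean(samples))
```

With such a sharp estimate of sum(D), a hypothesis test comparing it against the two candidate sums (target record present vs. absent) decides membership with high confidence, which is the shape of the membership inference attack the abstract describes.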
Pages: 3123-3137
Page count: 15