Preserving Data Utility in Differentially Private Smart Home Data

Cited by: 0
Authors
Stirapongsasuti, Sopicha [1 ]
Tiausas, Francis Jerome [1 ]
Nakamura, Yugo [2 ]
Yasumoto, Keiichi [1 ,3 ]
Affiliations
[1] Nara Inst Sci & Technol, Ikoma, Nara 6300192, Japan
[2] Kyushu Univ, Dept Informat Sci & Elect Engn, Fukuoka 8190395, Japan
[3] RIKEN, Ctr Adv Intelligence Project AIP, Tokyo 1030027, Japan
Keywords
Differential privacy; machine learning; privacy; smart home; PRESERVATION; EFFICIENT; SYSTEM; CARE
DOI
10.1109/ACCESS.2024.3390039
CLC number
TP [automation technology, computer technology]
Discipline code
0812
Abstract
The development of smart sensors and appliances enables a wide range of services. Nevertheless, aggregating data that contains privacy-sensitive information in a single location poses significant risks, since such information can be misused by a malicious attacker. Previous studies have attempted to apply privacy mechanisms, but at the cost of reduced data utility. In this paper, we propose privacy protection mechanisms for privacy-sensitive sensor data generated in a smart home. We leverage Rényi differential privacy (RDP) to preserve privacy. However, preliminary results showed that using RDP alone still significantly decreases data utility. We therefore propose a novel scheme called feature merging anonymization (FMA), which preserves privacy while maintaining data utility by merging feature dataframes of the same activities from other homes. We also define an expected trade-off: the data utility achieved should be greater than the privacy preserved. To evaluate the proposed techniques, we define privacy preservation as the inverse accuracy of person identification (PI) and data utility as the accuracy of activity recognition (AR). We trained the AR and PI models with and without FMA on two open smart-home datasets, i.e., the HIS and Toyota datasets. With FMA, we lowered the accuracy of PI to 73.85% (HIS) and 41.18% (Toyota), compared to 100% without FMA, while maintaining the accuracy of AR at 94.62% (HIS) and 87.3% (Toyota), compared to 98.58% and 89.28% without FMA. A further experiment explored the feasibility of implementing FMA on a local server by partially merging frames of the original activity with frames of other activities at different merging ratios. The results show that the local server can still satisfy the expected trade-off at some ratios.
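The core idea in the abstract, FMA, merges feature dataframes of the same activity across homes so that per-home patterns enabling person identification are diluted. Below is a minimal sketch of that merging step, assuming pandas feature frames with hypothetical "home_id" and "activity" columns; the function name, signature, and column layout are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of feature merging anonymization (FMA): augment one home's
# feature frames for an activity with same-activity frames from other homes.
# "home_id", "activity", and fma_merge are hypothetical names; the actual
# feature layout in the paper may differ.
import pandas as pd

def fma_merge(features: pd.DataFrame, target_home: str, activity: str,
              merge_ratio: float = 1.0, seed: int = 0) -> pd.DataFrame:
    """Merge a fraction (merge_ratio) of other homes' same-activity frames
    into the target home's frames and strip the home identifier."""
    same_activity = features[features["activity"] == activity]
    own = same_activity[same_activity["home_id"] == target_home]
    others = same_activity[same_activity["home_id"] != target_home]
    n_extra = int(len(others) * merge_ratio)  # ratio 1.0 = merge all foreign frames
    extra = others.sample(n=n_extra, random_state=seed)
    merged = pd.concat([own, extra], ignore_index=True)
    # Drop the identifier so merged frames no longer link back to a home.
    return merged.drop(columns=["home_id"])
```

Note that the local-server experiment in the abstract also varies a merging ratio, but mixes in frames of other activities rather than other homes; the merge_ratio parameter here only illustrates partial merging in the cross-home setting.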
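The abstract defines privacy preservation as the inverse accuracy of PI and data utility as the accuracy of AR, and requires utility to exceed the privacy preserved. A small worked check under that reading, using the reported results; the inequality form is our interpretation of those definitions, not a formula quoted from the paper.

```python
# Expected trade-off check: data utility (AR accuracy) should be greater
# than the privacy preserved (1 - PI accuracy). The threshold form is an
# assumption derived from the abstract's definitions.
def tradeoff_satisfied(ar_acc: float, pi_acc: float) -> bool:
    privacy_preserved = 1.0 - pi_acc  # inverse accuracy of person identification
    return ar_acc > privacy_preserved

print(tradeoff_satisfied(0.9462, 0.7385))  # HIS with FMA:    0.9462 > 0.2615 -> True
print(tradeoff_satisfied(0.8730, 0.4118))  # Toyota with FMA: 0.8730 > 0.5882 -> True
```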
Pages: 56571-56581
Number of pages: 11