Preserving Data Utility in Differentially Private Smart Home Data

Cited by: 0
Authors
Stirapongsasuti, Sopicha [1 ]
Tiausas, Francis Jerome [1 ]
Nakamura, Yugo [2 ]
Yasumoto, Keiichi [1 ,3 ]
Affiliations
[1] Nara Inst Sci & Technol, Ikoma, Nara 6300192, Japan
[2] Kyushu Univ, Dept Informat Sci & Elect Engn, Fukuoka 8190395, Japan
[3] RIKEN, Ctr Adv Intelligence Project AIP, Tokyo 1030027, Japan
Keywords
Differential privacy; machine learning; privacy; smart home; preservation; efficient; system; care
DOI
10.1109/ACCESS.2024.3390039
CLC classification
TP [automation technology; computer technology]
Subject classification
0812
Abstract
The development of smart sensors and appliances enables a wide range of services. Nevertheless, aggregating privacy-sensitive data in a single location poses significant risks, as such information can be misused by a malicious attacker. Previous studies have attempted to apply privacy mechanisms, but at the cost of data utility. In this paper, we propose privacy protection mechanisms for privacy-sensitive sensor data generated in a smart home. We leverage Rényi differential privacy (RDP) to preserve privacy. However, preliminary results showed that RDP alone still significantly decreases data utility. Thus, a novel scheme called feature merging anonymization (FMA) is proposed to preserve privacy while maintaining data utility by merging feature dataframes of the same activities from other homes. The expected trade-off is defined such that data utility should be greater than the privacy preserved. To evaluate the proposed techniques, we define privacy preservation as the inverse accuracy of person identification (PI) and data utility as the accuracy of activity recognition (AR). We trained the AR and PI models with and without FMA on two open smart-home datasets, the HIS and Toyota datasets. With FMA, PI accuracy dropped to 73.85% (HIS) and 41.18% (Toyota), compared to 100% without FMA, while AR accuracy was maintained at 94.62% and 87.3% with FMA versus 98.58% and 89.28% without FMA. A further experiment explored the feasibility of implementing FMA on a local server by partially merging frames of the original activity with frames of other activities at different merging ratios. The results show that the local server can still satisfy the expected trade-off at some ratios.
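The abstract describes FMA only at a high level. As an illustration, the following is a minimal Python sketch of the merging step, assuming per-frame features are stored in pandas DataFrames with a hypothetical "activity" label column; the function name, column names, and the merge_ratio parameter (standing in for the paper's partial merging ratios) are assumptions made for illustration, not the authors' implementation.

import pandas as pd

def feature_merging_anonymization(target_frames: pd.DataFrame,
                                  other_home_frames: pd.DataFrame,
                                  merge_ratio: float = 1.0,
                                  seed: int = 0) -> pd.DataFrame:
    # Hypothetical sketch of FMA: for each activity present in the target
    # home's feature frames, mix in same-activity feature frames sampled from
    # other homes. merge_ratio controls how many foreign frames are added per
    # target frame (1.0 adds roughly one foreign frame per target frame).
    merged_parts = [target_frames]
    for activity, group in target_frames.groupby("activity"):
        candidates = other_home_frames[other_home_frames["activity"] == activity]
        if candidates.empty:
            continue
        n_merge = min(int(len(group) * merge_ratio), len(candidates))
        merged_parts.append(candidates.sample(n=n_merge, random_state=seed))
    # Activity labels stay intact, so an activity-recognition model can still
    # be trained on the merged data, while mixing frames from multiple homes
    # makes person identification harder.
    return pd.concat(merged_parts, ignore_index=True)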
Pages: 56571-56581
Page count: 11