Training Generative Models From Privatized Data via Entropic Optimal Transport

Cited by: 2
Authors
Reshetova, Daria [1 ]
Chen, Wei-Ning [1 ]
Ozgur, Ayfer [1 ]
Affiliation
[1] Stanford Univ, Dept Elect Engn, Stanford, CA 94305 USA
Keywords
Privacy; GANs; entropic optimal transport
DOI
10.1109/JSAIT.2024.3387463
Chinese Library Classification (CLC) number
TP [Automation technology, computer technology]
Discipline classification code
0812
Abstract
Local differential privacy is a powerful method for privacy-preserving data collection. In this paper, we develop a framework for training Generative Adversarial Networks (GANs) on differentially privatized data. We show that entropic regularization of optimal transport, a popular regularization method often leveraged for its computational benefits, enables the generator to learn the raw (unprivatized) data distribution even though it only has access to privatized samples. We further prove that this leads to fast statistical convergence at the parametric rate, so entropic regularization of optimal transport uniquely enables the mitigation of both the effects of privatization noise and the curse of dimensionality in statistical convergence. We provide experimental evidence supporting the efficacy of the framework in practice.
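To make the setup concrete, below is a minimal sketch (not the authors' implementation) of the quantity at the heart of the framework: the entropic optimal transport cost between privatized samples and generator samples, computed with plain Sinkhorn iterations. The Gaussian privatization mechanism, the helper names `privatize` and `sinkhorn_cost`, and the choice of regularization parameter `eps = 2 * sigma**2` are illustrative assumptions, not the paper's exact training objective.

```python
# Illustrative sketch only: entropic OT cost between privatized data and
# generator samples via Sinkhorn iterations (assumptions noted in comments).
import numpy as np

def privatize(x, sigma, rng):
    """Gaussian privatization mechanism (assumed for illustration): add i.i.d. noise."""
    return x + sigma * rng.normal(size=x.shape)

def sinkhorn_cost(x, y, eps, n_iter=200):
    """Transport cost of the entropic-optimal plan between empirical measures on x and y."""
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)  # squared-distance cost matrix
    K = np.exp(-C / eps)                                       # Gibbs kernel
    a = np.full(len(x), 1.0 / len(x))                          # uniform weights on x
    b = np.full(len(y), 1.0 / len(y))                          # uniform weights on y
    u, v = np.ones_like(a), np.ones_like(b)
    for _ in range(n_iter):                                    # Sinkhorn scaling updates
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]                            # entropic-optimal transport plan
    return np.sum(P * C)                                       # linear (transport) part of the objective

rng = np.random.default_rng(0)
raw = rng.normal(loc=2.0, scale=1.0, size=(256, 2))    # stand-in for raw data
sigma = 1.0                                             # privatization noise level (assumed)
priv = privatize(raw, sigma, rng)                       # only privatized samples are observed
fake = rng.normal(loc=0.0, scale=1.0, size=(256, 2))    # stand-in for generator output
eps = 2 * sigma ** 2                                    # regularization tied to the noise (assumed)
print(sinkhorn_cost(priv, fake, eps))
```

In a GAN training loop, a loss of this form would be minimized over the generator's parameters; the point of the framework is that, with entropic regularization matched to the privatization noise, the minimizer targets the raw data distribution rather than its noisy, privatized version.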
Pages: 221-235
Number of pages: 15
Related papers
50 records in total
  • [21] PointOT: Interpretable Geometry-Inspired Point Cloud Generative Model via Optimal Transport
    Zhang, Ruonan
    Chen, Jingyi
    Gao, Wei
    Li, Ge
    Li, Thomas H.
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2022, 32 (10) : 6792 - 6806
  • [22] Learning Generative Models for Climbing Aircraft from Radar Data
    Pepper, Nick
    Thomas, Marc
    JOURNAL OF AEROSPACE INFORMATION SYSTEMS, 2024, 21 (06): : 474 - 481
  • [23] COT: A Generative Approach for Hate Speech Counter-Narratives via Contrastive Optimal Transport
    Zhang, Linhao
    Jin, Li
    Xu, Guangluan
    Li, Xiaoyu
    Sun, Xian
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2025, 9 (01): : 740 - 756
  • [24] Generative Shape Models: Joint Text Recognition and Segmentation with Very Little Training Data
    Lou, Xinghua
    Kansky, Ken
    Lehrach, Wolfgang
    Laan, C. C.
    Marthi, Bhaskara
    Phoenix, D. Scott
    George, Dileep
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [25] Enhancing Stability in Training Conditional Generative Adversarial Networks via Selective Data Matching
    Kong, Kyeongbo
    Kim, Kyunghun
    Kang, Suk-Ju
    IEEE ACCESS, 2024, 12 : 119647 - 119659
  • [26] Performance Scaling via Optimal Transport: Enabling Data Selection from Partially Revealed Sources
    Kang, Feiyang
    Just, Hoang Anh
    Sahu, Anit Kumar
    Jia, Ruoxi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [27] Extracting Training Data from Diffusion Models
    Carlini, Nicholas
    Hayes, Jamie
    Nasr, Milad
    Jagielski, Matthew
    Sehwag, Vikash
    Tramer, Florian
    Balle, Borja
    Ippolito, Daphne
    Wallace, Eric
    PROCEEDINGS OF THE 32ND USENIX SECURITY SYMPOSIUM, 2023, : 5253 - 5270
  • [28] Inference With Aggregate Data in Probabilistic Graphical Models: An Optimal Transport Approach
    Singh, Rahul
    Haasler, Isabel
    Zhang, Qinsheng
    Karlsson, Johan
    Chen, Yongxin
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2022, 67 (09) : 4483 - 4497
  • [29] Multimodal Data Fusion in High-Dimensional Heterogeneous Datasets Via Generative Models
    Yilmaz, Yasin
    Aktukmak, Mehmet
    Hero, Alfred
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2021, 69 : 5175 - 5188
  • [30] Multivariate Soft Rank via Entropy-Regularized Optimal Transport: Sample Efficiency and Generative Modeling
    Bin Masud, Shoaib
    Werenski, Matthew
    Murphy, James M.
    Aeron, Shuchin
    JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24