Fast and Accurate Deep Leakage from Gradients Based on Wasserstein Distance

Cited by: 5
Authors
He, Xing [1 ,2 ]
Peng, Changgen [1 ,3 ]
Tan, Weijie [1 ,3 ,4 ]
Affiliations
[1] Guizhou Univ, Coll Comp Sci & Technol, State Key Lab Publ Big Data, Guiyang 550025, Peoples R China
[2] Guizhou Minzu Univ, Guiyang 550025, Peoples R China
[3] Guizhou Univ, Guizhou Big Data Acad, Guiyang 550025, Peoples R China
[4] Guizhou Univ, Key Lab Adv Mfg Technol, Minist Educ, Guiyang 550025, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
NEURAL-NETWORKS;
DOI
10.1155/2023/5510329
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Shared gradients are widely used to protect the private information of training data in distributed machine learning systems. However, Deep Leakage from Gradients (DLG) research has shown that private training data can be recovered from shared gradients. The DLG method still suffers from issues such as the "Exploding Gradient" phenomenon, a low attack success rate, and low fidelity of the recovered data. In this study, a Wasserstein DLG method, named WDLG, is proposed. Theoretical analysis shows that, provided the output layer of the model has a bias term, the label of the data can be predicted from whether the corresponding bias gradient is negative, independently of how well the shared gradient is approximated; thus, the label can be recovered with 100% accuracy. In the proposed method, the Wasserstein distance is used to compute the error loss between the shared gradient and the virtual gradient, which improves training stability, resolves the "Exploding Gradient" phenomenon, and improves the fidelity of the recovered data. Moreover, a large-learning-rate strategy is designed to further accelerate model training convergence. Finally, the WDLG method is validated on the MNIST, Fashion MNIST, SVHN, CIFAR-100, and LFW datasets. Experimental results show that the proposed WDLG method provides more stable updates of the virtual data, a higher attack success rate, faster model convergence, higher fidelity of the recovered images, and support for large-learning-rate strategies.
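The two ideas summarized in the abstract, reading the label off the sign of the output-layer bias gradient and optimizing virtual data so that its gradient matches the shared gradient under a Wasserstein-style loss, can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation: the toy network, the use of a 1-D Wasserstein-1 distance between sorted, flattened gradient entries as the matching loss, and the learning rate are illustrative assumptions; the paper's exact formulation may differ.

```python
# Hypothetical sketch of a WDLG-style gradient-leakage attack (not the authors' code).
# Assumptions: a small classifier whose output layer has a bias term, cross-entropy
# loss, and a 1-D Wasserstein-1 distance between flattened gradient entries standing
# in for the paper's Wasserstein gradient-matching loss.
import torch
import torch.nn as nn

torch.manual_seed(0)

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 256), nn.ReLU(),
                      nn.Linear(256, 10))          # output layer has a bias term
criterion = nn.CrossEntropyLoss()

# --- victim side: compute the gradient that would be shared ---------------------
x_true = torch.rand(1, 1, 28, 28)
y_true = torch.tensor([3])
shared_grads = torch.autograd.grad(criterion(model(x_true), y_true),
                                   model.parameters())
shared_grads = [g.detach() for g in shared_grads]

# --- step 1: recover the label from the sign of the output-layer bias gradient ---
# For cross-entropy on a single sample, the bias gradient is softmax(z) - onehot(y),
# so only the true class has a negative entry; no gradient approximation is needed.
bias_grad = shared_grads[-1]                       # gradient w.r.t. the output bias
y_rec = torch.tensor([int(torch.argmin(bias_grad))])

# --- step 2: optimize virtual data so its gradient matches the shared gradient ---
def w1_distance(ga, gb):
    """1-D Wasserstein-1 distance between the empirical distributions of entries."""
    a, _ = torch.sort(ga.flatten())
    b, _ = torch.sort(gb.flatten())
    return (a - b).abs().mean()

x_dummy = torch.rand_like(x_true, requires_grad=True)
opt = torch.optim.Adam([x_dummy], lr=0.1)          # relatively large learning rate (assumed)

for step in range(300):
    opt.zero_grad()
    dummy_grads = torch.autograd.grad(criterion(model(x_dummy), y_rec),
                                      model.parameters(), create_graph=True)
    loss = sum(w1_distance(dg, sg) for dg, sg in zip(dummy_grads, shared_grads))
    loss.backward()
    opt.step()

print("recovered label:", y_rec.item(), "matching loss:", loss.item())
```

Under these assumptions the label falls out of step 1 in closed form, while step 2 only has to drive the gradient-matching loss down; the sorted-entries distance is merely one differentiable Wasserstein-1 surrogate and could be replaced by whatever formulation the paper actually uses.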
Pages: 12