Practical Challenges in Differentially-Private Federated Survival Analysis of Medical Data

Cited: 0
Authors
Rahimian, Shadi [1 ]
Kerkouche, Raouf [1 ]
Kurth, Ina [2 ]
Fritz, Mario [1 ]
Institutions
[1] CISPA Helmholtz Ctr Informat Secur, Saarbrucken, Germany
[2] DKFZ German Canc Res Ctr, Heidelberg, Germany
Keywords
BREAST-CANCER; PROGNOSIS; MODEL; PREDICTION;
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Survival analysis, or time-to-event analysis, aims to model and predict the time it takes for an event of interest to occur in a population or an individual. In the medical context this event might be death, metastasis, recurrence of cancer, etc. Recently, neural networks designed specifically for survival analysis have become a popular and attractive alternative to more traditional methods. In this paper, we take advantage of the inherent properties of neural networks to federate the training of these models. This is crucial in the medical domain, where data is scarce and collaboration among multiple health centers is essential to reach conclusive decisions about the properties of a treatment or a disease. To protect the privacy of the datasets, it is common to apply differential privacy on top of federated learning. Differential privacy introduces random noise at different stages of training, making it harder for an adversary to extract details about the data. However, in the realistic setting of small medical datasets and only a few data centers, this noise makes it harder for the models to converge. To address this problem, we propose DPFed-post, which adds a post-processing stage to the private federated learning scheme. This extra step helps regulate the magnitude of the noisy average parameter update and eases convergence of the model. For our experiments, we choose three real-world datasets in the realistic setting where each health center has only a few hundred records, and we show that DPFed-post increases the performance of the models by an average of up to 17% compared to the standard differentially private federated learning scheme.
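The scheme described in the abstract can be sketched as follows. This is a minimal illustrative sketch of one round of differentially private federated averaging with a post-processing rescaling step; the function name `dp_federated_round`, the `rescale` flag, and the specific renormalization rule are assumptions for illustration, not the paper's exact DPFed-post algorithm.

```python
import numpy as np

def clip_update(update, clip_norm):
    """Clip a client's parameter update so its L2 norm is at most clip_norm.

    Clipping bounds each client's contribution, which fixes the sensitivity
    used to calibrate the Gaussian noise.
    """
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / max(norm, 1e-12))

def dp_federated_round(client_updates, clip_norm=1.0,
                       noise_multiplier=1.0, rescale=True):
    """One round of DP federated averaging with an optional
    post-processing step on the noisy average update."""
    n = len(client_updates)
    clipped = [clip_update(u, clip_norm) for u in client_updates]
    avg = np.mean(clipped, axis=0)
    # Gaussian noise calibrated to the per-client clipping norm.
    noise = np.random.normal(0.0, noise_multiplier * clip_norm / n,
                             size=avg.shape)
    noisy_avg = avg + noise
    if rescale:
        # Post-processing: bound the L2 norm of the noisy average so the
        # added noise cannot inflate the update magnitude. Post-processing
        # a DP output consumes no additional privacy budget.
        noisy_avg = clip_update(noisy_avg, clip_norm)
    return noisy_avg
```

Because differential privacy is immune to post-processing, any deterministic transformation of the noisy aggregate, such as the rescaling above, preserves the privacy guarantee while keeping the update magnitude in a range where the model can still converge.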
Pages: 411-425 (15 pages)
Related papers (50 total)
  • [1] Differentially-Private Network Trace Analysis
    McSherry, Frank
    Mahajan, Ratul
    [J]. ACM SIGCOMM COMPUTER COMMUNICATION REVIEW, 2010, 40 (04) : 123 - 134
  • [2] DIFFERENTIALLY-PRIVATE CANONICAL CORRELATION ANALYSIS
    Imtiaz, Hafiz
    Sarwate, Anand D.
    [J]. 2017 IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (GLOBALSIP 2017), 2017, : 283 - 287
  • [3] DISTRIBUTED DIFFERENTIALLY-PRIVATE CANONICAL CORRELATION ANALYSIS
    Imtiaz, Hafiz
    Sarwate, Anand D.
    [J]. 2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 3112 - 3116
  • [4] Differentially-Private Software Analytics for Mobile Apps: Opportunities and Challenges
    Zhang, Hailong
    Latif, Sufian
    Bassily, Raef
    Rountev, Atanas
    [J]. PROCEEDINGS OF THE 4TH ACM SIGSOFT INTERNATIONAL WORKSHOP ON SOFTWARE ANALYTICS (SWAN'18), 2018, : 26 - 29
  • [5] Shade: A Differentially-Private Wrapper For Enterprise Big Data
    Heifetz, Alexander
    Mugunthan, Vaikkunth
    Kagal, Lalana
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2017, : 1033 - 1042
  • [6] Incremental release of differentially-private check-in data
    Riboni, Daniele
    Bettini, Claudio
    [J]. PERVASIVE AND MOBILE COMPUTING, 2015, 16 : 220 - 238
  • [7] Market Value of Differentially-Private Smart Meter Data
    Chhachhi, Saurab
    Teng, Fei
    [J]. 2021 IEEE POWER & ENERGY SOCIETY INNOVATIVE SMART GRID TECHNOLOGIES CONFERENCE (ISGT), 2021,
  • [8] Assessing Wearable Human Activity Recognition Systems Against Data Poisoning Attacks in Differentially-Private Federated Learning
    Shahid, Abdur R.
    Imteaj, Ahmed
    Badsha, Shahriar
    Hossain, Md Zarif
    [J]. 2023 IEEE INTERNATIONAL CONFERENCE ON SMART COMPUTING, SMARTCOMP, 2023, : 355 - 360
  • [9] Differentially-Private Release of Check-in Data for Venue Recommendation
    Riboni, Daniele
    Bettini, Claudio
    [J]. 2014 IEEE INTERNATIONAL CONFERENCE ON PERVASIVE COMPUTING AND COMMUNICATIONS (PERCOM), 2014, : 190 - 198
  • [10] Data Poisoning against Differentially-Private Learners: Attacks and Defenses
    Ma, Yuzhe
    Zhu, Xiaojin
    Hsu, Justin
    [J]. PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 4732 - 4738