Model compression and privacy preserving framework for federated learning

Cited by: 12
Authors
Zhu, Xi [1 ]
Wang, Junbo [2 ]
Chen, Wuhui [3 ]
Sato, Kento [4 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Syst Sci & Engn, Guangzhou 510006, Peoples R China
[2] Sun Yat Sen Univ, Sch Intelligent Syst Engn, Shenzhen 518107, Peoples R China
[3] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
[4] RIKEN, Ctr Computat Sci, Kobe 6500047, Japan
Keywords
Federated learning; Privacy preserving; Model compression; Convolutional neural networks; EFFICIENT; SYSTEM;
DOI
10.1016/j.future.2022.10.026
Chinese Library Classification (CLC)
TP301 [Theory and Methods];
Discipline code
081202;
Abstract
Federated learning (FL) is a collaborative learning paradigm that has attracted extensive attention for its privacy-preserving nature: clients collaboratively train a shared neural network model on their local datasets and, throughout training, upload only model parameters over the wireless network instead of the original data. Because FL significantly reduces the amount of data transmitted, it can further help meet the efficiency and security requirements of next-generation wireless systems. Although FL reduces the volume of information that must be transmitted, the model parameter updates still suffer from privacy leakage and communication bottlenecks, especially in wireless networks. To address these privacy and communication problems, this paper proposes a model compression based FL framework. First, the designed model compression framework provides effective support for efficient and secure model parameter updating in FL while preserving the personalization of every client. Second, the proposed perturbed model compression method further reduces the model size and protects model privacy without sacrificing much accuracy. In addition, the framework allows decryption and decompression to be performed simultaneously, using a reconstruction algorithm applied to the encrypted and compressed model parameters produced by the perturbed model compression method. Finally, illustrative results demonstrate that the proposed model compression based FL framework significantly reduces the number of uploaded model parameters while providing strong privacy preservation. For example, when the compression ratio is 0.0953 (i.e., only 9.53% of the parameters are uploaded), the accuracy on MNIST reaches 97%, compared with 98% without compression. (c) 2022 Elsevier B.V. All rights reserved.
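The abstract gives only a high-level description of the perturbed model compression and reconstruction steps, so the sketch below is not the authors' algorithm. It is a minimal Python illustration, under assumed details, of the general pipeline the abstract outlines: a client compresses a sparse model update with a seeded random projection, adds a small perturbation before uploading, and the server reconstructs the update in a single pass that inverts the projection (which requires the shared seed) while tolerating the perturbation as measurement noise. The Gaussian measurement matrix, the noise scale, the orthogonal-matching-pursuit recovery, and all function names are illustrative assumptions; only the compression ratio 0.0953 is taken from the abstract.

import numpy as np

def compress_and_perturb(update, ratio=0.0953, seed=0, noise_scale=1e-3):
    """Client side (illustrative): project the flattened update to a short vector, then perturb it."""
    n = update.size
    m = max(1, int(round(ratio * n)))              # compressed length (~9.53% of n)
    rng = np.random.default_rng(seed)              # seed assumed to be shared with the server
    A = rng.standard_normal((m, n)) / np.sqrt(m)   # random measurement matrix (assumption)
    y = A @ update                                 # compression: n values -> m values
    e = noise_scale * np.random.default_rng().standard_normal(m)  # small private perturbation
    return y + e

def reconstruct(y, n, sparsity, seed=0):
    """Server side (illustrative): rebuild the matrix from the shared seed and recover the update
    with orthogonal matching pursuit; the perturbation is absorbed as measurement noise."""
    m = y.size
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    residual, support = y.copy(), []
    for _ in range(sparsity):
        support.append(int(np.argmax(np.abs(A.T @ residual))))   # pick most correlated column
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None) # refit on the chosen support
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(n)
    x_hat[support] = coef
    return x_hat

if __name__ == "__main__":
    n, k = 1000, 10
    w = np.zeros(n)                                # a sparse model update (toy example)
    idx = np.random.default_rng(1).choice(n, k, replace=False)
    w[idx] = np.random.default_rng(2).standard_normal(k)
    y = compress_and_perturb(w)                    # ~95 values uploaded instead of 1000
    w_hat = reconstruct(y, n=n, sparsity=k)
    print("compressed length:", y.size)
    print("relative recovery error:", np.linalg.norm(w - w_hat) / np.linalg.norm(w))

In this reading, "decryption" corresponds to knowing the seed needed to regenerate the measurement matrix and "decompression" to the sparse recovery itself, which is one plausible way a single reconstruction step could play both roles that the abstract describes.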
Pages: 376-389
Page count: 14
Related papers
50 records in total
  • [1] PPFed: A Privacy-Preserving and Personalized Federated Learning Framework
    Zhang, Guangsheng
    Liu, Bo
    Zhu, Tianqing
    Ding, Ming
    Zhou, Wanlei
    [J]. IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (11) : 19380 - 19393
  • [2] Fedlabx: a practical and privacy-preserving framework for federated learning
    Yan, Yuping
    Kamel, Mohammed B. M.
    Zoltay, Marcell
    Gal, Marcell
    Hollos, Roland
    Jin, Yaochu
    Peter, Ligeti
    Tenyi, Akos
    [J]. COMPLEX & INTELLIGENT SYSTEMS, 2024, 10 (01) : 677 - 690
  • [3] A privacy preserving framework for federated learning in smart healthcare systems
    Wang, Wenshuo
    Li, Xu
    Qiu, Xiuqin
    Zhang, Xiang
    Brusic, Vladimir
    Zhao, Jindong
    [J]. INFORMATION PROCESSING & MANAGEMENT, 2023, 60 (01)
  • [4] A privacy-preserving federated learning framework for blockchain networks
    Abuzied, Youssif
    Ghanem, Mohamed
    Dawoud, Fadi
    Gamal, Habiba
    Soliman, Eslam
    Sharara, Hossam
    Elbatt, Tamer
    [J]. CLUSTER COMPUTING-THE JOURNAL OF NETWORKS SOFTWARE TOOLS AND APPLICATIONS, 2024, 27 (04) : 3997 - 4014
  • [5] A Verifiable and Privacy-Preserving Federated Learning Training Framework
    Duan, Haohua
    Peng, Zedong
    Xiang, Liyao
    Hu, Yuncong
    Li, Bo
    [J]. IEEE TRANSACTIONS ON DEPENDABLE AND SECURE COMPUTING, 2024, 21 (05) : 5046 - 5058
  • [6] Learned Parameter Compression for Efficient and Privacy-Preserving Federated Learning
    Chen, Yiming
    Abrahamyan, Lusine
    Sahli, Hichem
    Deligiannis, Nikos
    [J]. IEEE OPEN JOURNAL OF THE COMMUNICATIONS SOCIETY, 2024, 5 : 3503 - 3516
  • [7] PFLF: Privacy-Preserving Federated Learning Framework for Edge Computing
    Zhou, Hao
    Yang, Geng
    Dai, Hua
    Liu, Guoxiu
    [J]. IEEE TRANSACTIONS ON INFORMATION FORENSICS AND SECURITY, 2022, 17 : 1905 - 1918
  • [8] A Privacy-preserving Data Alignment Framework for Vertical Federated Learning
    Gao, Ying
    Xie, Yuxin
    Deng, Huanghao
    Zhu, Zukun
    Zhang, Yiyu
    [J]. Dianzi Yu Xinxi Xuebao/Journal of Electronics and Information Technology, 2024, 46 (08) : 3419 - 3427
  • [9] Privacy-preserving federated learning framework in multimedia courses recommendation
    Qin, YangJie
    Li, Ming
    Zhu, Jia
    [J]. WIRELESS NETWORKS, 2023, 29 (04) : 1535 - 1544