Communication-Efficient Federated Learning Based on Secret Sharing and Compressed Sensing

Cited by: 0
Authors
Chen L. [1 ,2 ]
Xiao D. [1 ,2 ]
Yu Z. [1 ,2 ]
Huang H. [1 ,2 ]
Li M. [1 ,2 ]
Affiliations
[1] College of Computer Science, Chongqing University, Chongqing
[2] Key Laboratory of Dependable Service Computing in Cyber Physical Society (Chongqing University), Ministry of Education, Chongqing
Funding
National Natural Science Foundation of China
Keywords
Communication-efficient; Compressed sensing; Federated learning; Resource-constrained; Secure multi-party computation
DOI
10.7544/issn1000-1239.20220526
Abstract
The rapid development of deep learning has brought great convenience, but it has also led to the disclosure of large amounts of private data. Federated learning (FL) allows clients to jointly train a model by sharing only gradients, which appears to solve the privacy-leakage problem; however, several studies have shown that the gradients transmitted in FL frameworks can still leak private information. Moreover, FL's high communication cost makes it difficult to deploy in resource-constrained environments. We therefore propose two communication-efficient and secure FL algorithms. Both use Top-K sparsification and compressed sensing to reduce the communication overhead of gradient transmission, and further apply additive secret sharing from secure multi-party computation (MPC) to encrypt the compressed-sensing measurements of the important gradient parameters, thereby reducing communication overhead and enhancing security at the same time. The two algorithms differ mainly in what the client and server exchange: the gradient measurements themselves, or quantized versions of those measurements. Experiments on the MNIST and Fashion-MNIST datasets show that, compared with other algorithms, the proposed algorithms further improve security at a lower communication cost and achieve better model accuracy. © 2022, Science Press. All rights reserved.
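The client-side pipeline described in the abstract (Top-K sparsification, then a compressed-sensing measurement, then additive secret sharing of the measurements) can be sketched as follows. This is a minimal illustration under assumed choices — the function names, the Gaussian measurement matrix, and the number of shares are assumptions for the sketch, not the paper's actual implementation:

```python
import numpy as np

def top_k_sparsify(grad, k):
    """Keep the k largest-magnitude gradient entries, zero the rest."""
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx]
    return sparse

def cs_measure(sparse_grad, phi):
    """Compressed-sensing measurement y = Phi @ x (m measurements, m < d)."""
    return phi @ sparse_grad

def additive_share(y, n_parties, rng):
    """Split measurement y into n_parties additive shares that sum back to y.
    No single share reveals anything about y on its own."""
    shares = [rng.normal(size=y.shape) for _ in range(n_parties - 1)]
    shares.append(y - sum(shares))
    return shares

# Illustrative parameters (assumed, not from the paper):
rng = np.random.default_rng(0)
d, m, k = 100, 30, 10                      # gradient dim, measurements, sparsity
grad = rng.normal(size=d)                  # stand-in for a client's gradient
phi = rng.normal(size=(m, d)) / np.sqrt(m)  # shared Gaussian measurement matrix

sparse = top_k_sparsify(grad, k)
y = cs_measure(sparse, phi)                # only m values instead of d
shares = additive_share(y, n_parties=3, rng=rng)
```

On the server side, the shares from all parties would be summed to recover the aggregated measurements, after which the aggregated sparse gradient could be reconstructed with a standard compressed-sensing recovery algorithm (e.g., orthogonal matching pursuit); the paper's second variant additionally quantizes the measurements before sharing.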
Pages: 2395-2407
Page count: 12
References
44 references in total
  • [1] Hao Fei, Min Geyong, Chen Jinjun, et al., An optimized computational model for multi-community-cloud social collaboration, IEEE Transactions on Services Computing, 7, 3, pp. 346-358, (2014)
  • [2] Qian Wenjun, Sheng Qingni, Wu Pengfei, et al., Research progress on privacy-preserving techniques in big data computing environment, Chinese Journal of Computers, 45, 4, pp. 669-701, (2022)
  • [3] Savazzi S, Nicoli M, Rampa V., Federated learning with cooperating devices: A consensus approach for massive IoT networks, IEEE Internet of Things Journal, 7, 5, pp. 4641-4654, (2020)
  • [4] Loghin D, Cai Shaofeng, Chen Gang, et al., The disruptions of 5G on data-driven technologies and applications, IEEE Transactions on Knowledge and Data Engineering, 32, 6, pp. 1179-1198, (2020)
  • [5] Wang Shiqiang, Tuor T, Salonidis T, et al., Adaptive federated learning in resource constrained edge computing systems, IEEE Journal on Selected Areas in Communications, 37, 6, pp. 1205-1221, (2019)
  • [6] McMahan B, Ramage D., Federated learning: Collaborative machine learning without centralized training data, Google Research Blog, (2017)
  • [7] Zhu Ligeng, Liu Zhijian, Han Song, Deep leakage from gradients, Advances in Neural Information Processing Systems, (2019)
  • [8] Zhao Bo, Mopuri K, Bilen H., iDLG: Improved deep leakage from gradients, (2020)
  • [9] McMahan B, Moore E, Ramage D, et al., Communication-efficient learning of deep networks from decentralized data, Proc of Artificial Intelligence and Statistics, pp. 1273-1282, (2017)
  • [10] Li Chengxi, Li Gang, Varshney P., Communication-efficient federated learning based on compressed sensing, IEEE Internet of Things Journal, 8, 20, pp. 15531-15541, (2021)