BYZANTINE-ROBUST AGGREGATION WITH GRADIENT DIFFERENCE COMPRESSION AND STOCHASTIC VARIANCE REDUCTION FOR FEDERATED LEARNING

Cited by: 0
Authors
Zhu, Heng [1 ,2 ]
Ling, Qing [1 ]
Affiliations
[1] Sun Yat-sen Univ, Guangzhou, Guangdong, Peoples R China
[2] Univ Calif San Diego, San Diego, CA 92103 USA
Keywords
Federated learning; communication efficiency; Byzantine robustness; gradient compression; Internet
DOI
10.1109/ICASSP43922.2022.9746746
Chinese Library Classification
O42 [Acoustics]
Subject Classification Codes
070206; 082403
Abstract
We investigate the problem of Byzantine-robust compressed federated learning, where the transmissions from the workers to the master node are compressed and subject to malicious attacks from an unknown number of Byzantine workers. We show that the vanilla combination of distributed compressed stochastic gradient descent (SGD) with geometric median-based robust aggregation suffers from the compression noise under Byzantine attacks. In light of this observation, we propose to reduce the compression noise with gradient difference compression so as to improve Byzantine-robustness. We also observe the impact of the intrinsic stochastic noise caused by random sampling, and adopt the stochastic average gradient algorithm (SAGA) to gradually eliminate the inner variations of the regular workers. We prove that the proposed algorithm reaches a neighborhood of the optimal solution at a linear convergence rate, and that the asymptotic learning error is of the same order as that of the state-of-the-art uncompressed method. Finally, numerical experiments demonstrate the effectiveness of the proposed method.
Pages: 4278 - 4282
Number of pages: 5
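
The abstract combines three ingredients: lossy gradient compression at the workers, geometric median-based robust aggregation at the master, and gradient difference compression to keep the compression noise from accumulating under Byzantine attacks. The NumPy sketch below is a minimal illustration of how these pieces could fit together in a single communication round; it is not the authors' implementation, and the top-k compressor, the Weiszfeld iteration for the geometric median, and all function names are illustrative assumptions. The SAGA-based variance reduction described in the abstract is omitted for brevity.

```python
import numpy as np


def top_k_compress(v, k):
    """Keep only the k largest-magnitude entries of v (a common biased compressor)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out


def geometric_median(points, iters=50, eps=1e-8):
    """Approximate the geometric median with Weiszfeld's algorithm (robust aggregation)."""
    z = points.mean(axis=0)
    for _ in range(iters):
        dist = np.maximum(np.linalg.norm(points - z, axis=1), eps)
        w = 1.0 / dist
        z = (w[:, None] * points).sum(axis=0) / w.sum()
    return z


def worker_message(g_i, h_i, k):
    """Gradient-difference compression: transmit C(g_i - h_i) instead of C(g_i).

    The reference h_i is updated identically on the worker and the master, so the
    quantity being compressed shrinks as the iterates stabilize, which damps the
    compression noise that plain compressed SGD suffers from.
    """
    delta_i = top_k_compress(g_i - h_i, k)
    return delta_i, h_i + delta_i  # (message to the master, updated reference)


def master_aggregate(deltas, refs):
    """Reconstruct each worker's approximate gradient h_i + delta_i and robustly aggregate."""
    recovered = np.stack([h + d for h, d in zip(refs, deltas)])
    return geometric_median(recovered)


# Toy round: 5 workers with 10-dimensional gradients; worker 0 is Byzantine and
# replaces its message with large random noise.
rng = np.random.default_rng(0)
dim, n_workers, k = 10, 5, 3
master_refs = [np.zeros(dim) for _ in range(n_workers)]   # master's copies of h_i
grads = [rng.normal(size=dim) for _ in range(n_workers)]  # stand-ins for local stochastic gradients
messages = [worker_message(g, h, k)[0] for g, h in zip(grads, master_refs)]
messages[0] = 100.0 * rng.normal(size=dim)                # Byzantine message
agg = master_aggregate(messages, master_refs)             # robust gradient estimate for the model update
master_refs = [h + d for h, d in zip(master_refs, messages)]  # master keeps its references in sync
```

In this toy round the master reconstructs each worker's approximate gradient as h_i + delta_i and takes the geometric median, so a single corrupted message perturbs the aggregate only within the robustness of the median rather than entering it directly as in plain averaging.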
Related Papers
50 records in total
  • [21] BOBA: Byzantine-Robust Federated Learning with Label Skewness
    Bao, Wenxuan
    Wu, Jun
    He, Jingrui
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024, 238
  • [22] STOCHASTIC ADMM FOR BYZANTINE-ROBUST DISTRIBUTED LEARNING
    Lin, Feng
    Ling, Qing
    Li, Weiyu
    Xiong, Zhiwei
    [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 3172 - 3176
  • [23] Byzantine-Robust Federated Linear Bandits
    Jadbabaie, Ali
    Li, Haochuan
    Qian, Jian
    Tian, Yi
    [J]. 2022 IEEE 61ST CONFERENCE ON DECISION AND CONTROL (CDC), 2022, : 5206 - 5213
  • [24] FedCom: Byzantine-Robust Federated Learning Using Data Commitment
    Zhao, Bo
    Wang, Tao
    Fang, Liming
    [J]. ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 33 - 38
  • [25] Byzantine-robust decentralized stochastic optimization with stochastic gradient noise-independent learning error
    Peng, Jie
    Li, Weiyu
    Ling, Qing
    [J]. SIGNAL PROCESSING, 2024, 219
  • [26] Local Model Poisoning Attacks to Byzantine-Robust Federated Learning
    Fang, Minghong
    Cao, Xiaoyu
    Jia, Jinyuan
    Gong, Neil Zhenqiang
    [J]. PROCEEDINGS OF THE 29TH USENIX SECURITY SYMPOSIUM, 2020, : 1623 - 1640
  • [27] SIREN: Byzantine-robust Federated Learning via Proactive Alarming
    Guo, Hanxi
    Wang, Hao
    Song, Tao
    Hua, Yang
    Lv, Zhangcheng
    Jin, Xiulang
    Xue, Zhengui
    Ma, Ruhui
    Guan, Haibing
    [J]. PROCEEDINGS OF THE 2021 ACM SYMPOSIUM ON CLOUD COMPUTING (SOCC '21), 2021, : 47 - 60
  • [28] Efficient and Privacy-Preserving Byzantine-robust Federated Learning
    Luan, Shijie
    Lu, Xiang
    Zhang, Zhuangzhuang
    Chang, Guangsheng
    Guo, Yunchuan
    [J]. IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 2202 - 2208
  • [29] FLTrust: Byzantine-robust Federated Learning via Trust Bootstrapping
    Cao, Xiaoyu
    Fang, Minghong
    Liu, Jia
    Gong, Neil Zhenqiang
    [J]. 28TH ANNUAL NETWORK AND DISTRIBUTED SYSTEM SECURITY SYMPOSIUM (NDSS 2021), 2021,
  • [30] FLForest: Byzantine-robust Federated Learning through Isolated Forest
    Wang, Tao
    Zhao, Bo
    Fang, Liming
    [J]. 2022 IEEE 28TH INTERNATIONAL CONFERENCE ON PARALLEL AND DISTRIBUTED SYSTEMS, ICPADS, 2022, : 296 - 303