Communication-Efficient Vertical Federated Learning via Compressed Error Feedback

Cited by: 0
Authors
Valdeira, Pedro [1 ,2 ,3 ]
Xavier, Joao [2 ]
Soares, Claudia [4 ]
Chi, Yuejie [1 ]
Affiliations
[1] Carnegie Mellon Univ, Dept Elect & Comp Engn, Pittsburgh, PA 15213 USA
[2] Univ Lisbon, Inst Super Tecn, P-1049001 Lisbon, Portugal
[3] Inst Syst & Robot, Lab Robot & Engn Syst, P-1600011 Lisbon, Portugal
[4] Univ Nova Lisboa, NOVA Sch Sci & Technol, Dept Comp Sci, P-2829516 Caparica, Portugal
Funding
U.S. National Science Foundation (NSF)
Keywords
Servers; Compressors; Training; Convergence; Vectors; Federated learning; Receivers; Optimization methods; Electronic mail; Data models; Vertical federated learning; nonconvex optimization; communication-compressed optimization; QUANTIZATION;
DOI
10.1109/TSP.2025.3540655
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics & Communication Technology]
Discipline Codes
0808; 0809
Abstract
Communication overhead is a known bottleneck in federated learning (FL). To address this, lossy compression is commonly applied to the information communicated between the server and the clients during training. In horizontal FL, where each client holds a subset of the samples, such communication-compressed training methods have recently seen significant progress. However, in their vertical FL counterparts, where each client holds a subset of the features, our understanding remains limited. To address this, we propose an error feedback compressed vertical federated learning (EF-VFL) method to train split neural networks. In contrast to previous communication-compressed methods for vertical FL, EF-VFL does not require a vanishing compression error for the gradient norm to converge to zero for smooth nonconvex problems. By leveraging error feedback, our method can achieve an O(1/T) convergence rate for a sufficiently large batch size, improving over the state-of-the-art O(1/√T) rate under O(1/√T) compression error, and matching the rate of uncompressed methods. Furthermore, when the objective function satisfies the Polyak-Łojasiewicz inequality, our method converges linearly. In addition to improving convergence, our method also supports the use of private labels. Numerical experiments show that EF-VFL significantly improves over the prior art, confirming our theoretical results.
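The abstract's core mechanism, error feedback, can be illustrated concretely. The following is a minimal NumPy sketch of generic error-feedback compression wrapped around a top-k sparsifier, the standard construction in communication-compressed training: transmit C(g + e) instead of C(g) and carry the residual forward so that compression error is corrected in later rounds rather than accumulating. The class name ErrorFeedbackCompressor and the choice of top-k compressor are illustrative assumptions, not the paper's EF-VFL algorithm itself, which applies this idea to the messages exchanged when training split neural networks.

import numpy as np

def top_k(v, k):
    # Keep the k largest-magnitude entries of v; zero out the rest.
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

class ErrorFeedbackCompressor:
    # Generic error feedback around a (possibly biased) compressor C:
    # send C(g + e) and store e <- (g + e) - C(g + e) for the next round.
    def __init__(self, dim, k):
        self.e = np.zeros(dim)  # residual (error) buffer
        self.k = k              # sparsity budget of the compressor

    def compress(self, g):
        corrected = g + self.e            # re-inject previously dropped mass
        msg = top_k(corrected, self.k)    # the message actually communicated
        self.e = corrected - msg          # remember what was left out
        return msg

In this sketch, each sender would wrap its outgoing messages in such a compressor; the abstract's point is that, with this residual correction, the gradient norm can converge to zero even though every individual message is lossy.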
Pages: 1065-1080
Page count: 16