Communication-Efficient Vertical Federated Learning via Compressed Error Feedback

Citations: 0
Authors
Valdeira, Pedro [1 ,2 ,3 ]
Xavier, Joao [2 ]
Soares, Claudia [4 ]
Chi, Yuejie [1 ]
Affiliations
[1] Carnegie Mellon Univ, Dept Elect & Comp Engn, Pittsburgh, PA 15213 USA
[2] Univ Lisbon, Inst Super Tecn, P-1049001 Lisbon, Portugal
[3] Inst Syst & Robot, Lab Robot & Engn Syst, P-1600011 Lisbon, Portugal
[4] Univ Nova Lisboa, NOVA Sch Sci & Technol, Dept Comp Sci, P-2829516 Caparica, Portugal
Funding
U.S. National Science Foundation;
Keywords
Servers; Compressors; Training; Convergence; Vectors; Federated learning; Receivers; Optimization methods; Electronic mail; Data models; Vertical federated learning; nonconvex optimization; communication-compressed optimization; QUANTIZATION;
DOI
10.1109/TSP.2025.3540655
CLC Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Codes
0808; 0809;
Abstract
Communication overhead is a known bottleneck in federated learning (FL). To address this, lossy compression is commonly used on the information communicated between the server and clients during training. In horizontal FL, where each client holds a subset of the samples, such communication-compressed training methods have recently seen significant progress. However, in their vertical FL counterparts, where each client holds a subset of the features, our understanding remains limited. To address this, we propose an error feedback compressed vertical federated learning (EF-VFL) method to train split neural networks. In contrast to previous communication-compressed methods for vertical FL, EF-VFL does not require a vanishing compression error for the gradient norm to converge to zero for smooth nonconvex problems. By leveraging error feedback, our method can achieve an O(1/T) convergence rate for a sufficiently large batch size, improving over the state-of-the-art O(1/√T) rate under O(1/√T) compression error, and matching the rate of uncompressed methods. Further, when the objective function satisfies the Polyak-Łojasiewicz inequality, our method converges linearly. In addition to improving convergence, our method also supports the use of private labels. Numerical experiments show that EF-VFL significantly improves over the prior art, confirming our theoretical results.
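The error feedback mechanism at the heart of EF-VFL admits a compact illustration. The following is a minimal Python sketch, assuming a generic top-k sparsifier as the compressor; the names topk_compress and EFCompressor are hypothetical and not taken from the paper's implementation. Each sender keeps a local error memory, adds it back to the next vector before compressing, and stores the new residual, so information discarded by compression is delayed rather than lost.

import numpy as np

def topk_compress(v, k):
    # Keep the k largest-magnitude entries of v; zero out the rest.
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

class EFCompressor:
    # Error feedback wrapper: the compression residual is kept in local
    # memory and added back to the next message (hypothetical helper class).
    def __init__(self, dim, k):
        self.error = np.zeros(dim)  # residual from past compressions
        self.k = k

    def compress(self, v):
        corrected = v + self.error                  # reintroduce past error
        message = topk_compress(corrected, self.k)  # lossy compression
        self.error = corrected - message            # store the new residual
        return message                              # this is what is sent

For example, EFCompressor(dim=8, k=2).compress(v) transmits a 2-sparse vector while the discarded entries accumulate in the error memory and are retried in later rounds. In EF-VFL, an operator of this kind would be applied to the information exchanged between the clients and the server when training the split network, which is what removes the requirement of a vanishing compression error.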
Pages: 1065-1080
Page count: 16
Related Papers
50 in total
  • [1] Communication-efficient Vertical Federated Learning via Compressed Error Feedback
    Valdeira, Pedro
    Xavier, Joao
    Soares, Claudia
    Chi, Yuejie
    32ND EUROPEAN SIGNAL PROCESSING CONFERENCE, EUSIPCO 2024, 2024: 1037 - 1041
  • [2] Communication-Efficient Federated Learning via Quantized Compressed Sensing
    Oh, Yongjeong
    Lee, Namyoon
    Jeon, Yo-Seb
    Poor, H. Vincent
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2023, 22 (02) : 1087 - 1100
  • [3] Communication-Efficient Vertical Federated Learning
    Khan, Afsana
    ten Thij, Marijn
    Wilbik, Anna
    ALGORITHMS, 2022, 15 (08)
  • [4] Communication-Efficient Nonconvex Federated Learning With Error Feedback for Uplink and Downlink
    Zhou, Xingcai
    Chang, Le
    Cao, Jinde
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (01) : 1003 - 1014
  • [5] Communication-Efficient Federated Learning Based on Compressed Sensing
    Li, Chengxi
    Li, Gang
    Varshney, Pramod K.
    IEEE INTERNET OF THINGS JOURNAL, 2021, 8 (20) : 15531 - 15541
  • [6] Communication-efficient federated learning
    Chen, Mingzhe
    Shlezinger, Nir
    Poor, H. Vincent
    Eldar, Yonina C.
    Cui, Shuguang
    PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2021, 118 (17)
  • [7] Communication-efficient federated learning via knowledge distillation
    Wu, Chuhan
    Wu, Fangzhao
    Lyu, Lingjuan
    Huang, Yongfeng
    Xie, Xing
    NATURE COMMUNICATIONS, 2022, 13 (01)
  • [8] Communication-Efficient Federated Learning via Predictive Coding
    Yue, Kai
    Jin, Richeng
    Wong, Chau-Wai
    Dai, Huaiyu
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2022, 16 (03) : 369 - 380
  • [9] Communication-Efficient Vertical Federated Learning with Limited Overlapping Samples
    Sun, Jingwei
    Xu, Ziyue
    Yang, Dong
    Nath, Vishwesh
    Li, Wenqi
    Zhao, Can
    Xu, Daguang
    Chen, Yiran
    Roth, Holger R.
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION, ICCV, 2023, : 5180 - 5189