A Communication-Efficient Hierarchical Federated Learning Framework via Shaping Data Distribution at Edge

Cited by: 7
|
Authors
Deng, Yongheng [1 ]
Lyu, Feng [2 ]
Xia, Tengxi [1 ]
Zhou, Yuezhi [3 ]
Zhang, Yaoxue [1 ,3 ]
Ren, Ju [1 ,3 ]
Yang, Yuanyuan [4 ]
Affiliations
[1] Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol BNRist, Dept Comp Sci & Technol, Beijing 100084, Peoples R China
[2] Cent South Univ, Sch Comp Sci & Engn, Changsha 410083, Peoples R China
[3] Zhongguancun Lab, Beijing 100084, Peoples R China
[4] SUNY Stony Brook, Dept Elect & Comp Engn, Stony Brook, NY 11794 USA
Keywords
Costs; Data models; Servers; Computational modeling; Training data; Federated learning; Distributed databases; Hierarchical federated learning; communication efficiency; edge computing; distributed edge intelligence; RESOURCE-ALLOCATION;
DOI
10.1109/TNET.2024.3363916
CLC Number
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812;
Abstract
Federated learning (FL) enables collaborative model training over distributed computing nodes without sharing their privacy-sensitive raw data. However, in FL, the iterative exchange of model updates between distributed nodes and the cloud server can incur significant communication cost, especially when the data distributions at distributed nodes are imbalanced, which requires more rounds of iteration. In this paper, through in-depth empirical studies, we show that extensive cloud aggregations can be avoided without compromising learning accuracy if frequent aggregations are enabled at the network edge. To this end, we adopt the hierarchical federated learning (HFL) framework, where a subset of distributed nodes act as edge aggregators to support edge aggregation. Under the HFL framework, we formulate a communication cost minimization (CCM) problem to minimize the total communication cost required to train a model to a target accuracy by making decisions on edge aggregator selection and node-edge associations. Inspired by our data-driven insight that the potential of HFL lies in the data distribution at edge aggregators, we propose ShapeFL, i.e., SHaping dAta distRibution at Edge, to transform and solve the CCM problem. In ShapeFL, we divide the original problem into two sub-problems that minimize the per-round communication cost and maximize the diversity of the data distribution at edge aggregators, respectively, and devise two lightweight algorithms to solve them. Extensive experiments are carried out on several open datasets and real-world network topologies, and the results demonstrate the efficacy of ShapeFL in terms of both learning accuracy and communication efficiency.
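The abstract does not spell out ShapeFL's algorithms, but the two-tier aggregation structure it builds on can be sketched in a few lines. The following is a minimal, illustrative sketch only: `hierarchical_round` performs one nodes-to-edge-aggregators-to-cloud averaging round (FedAvg-style weighted means), and `label_diversity` uses Shannon entropy of the pooled label histogram as one *plausible proxy* for the paper's "data distribution diversity" at an edge aggregator; all function names and the entropy choice are assumptions, not the authors' implementation.

```python
import math

def aggregate(updates, weights):
    """Weighted average of model parameter vectors (FedAvg-style)."""
    total = float(sum(weights))
    dim = len(updates[0])
    return [sum(w * u[k] for w, u in zip(weights, updates)) / total
            for k in range(dim)]

def hierarchical_round(node_params, node_sizes, associations, num_edges):
    """One HFL round: nodes -> edge aggregators -> cloud.

    node_params:  list of parameter vectors, one per node
    node_sizes:   local dataset size per node (aggregation weights)
    associations: associations[i] = edge aggregator that node i reports to
    """
    edge_models, edge_sizes = [], []
    for e in range(num_edges):
        members = [i for i, a in enumerate(associations) if a == e]
        # Frequent, cheap aggregation within each edge group
        edge_models.append(aggregate([node_params[i] for i in members],
                                     [node_sizes[i] for i in members]))
        edge_sizes.append(sum(node_sizes[i] for i in members))
    # Infrequent cloud aggregation over the (far fewer) edge models
    return aggregate(edge_models, edge_sizes)

def label_diversity(label_counts):
    """Shannon entropy of the label histogram pooled at an edge aggregator;
    higher entropy = the edge group's data looks more balanced/IID."""
    positive = [c for c in label_counts if c > 0]
    total = float(sum(positive))
    return -sum((c / total) * math.log(c / total) for c in positive)
```

Under these assumptions, node-edge association can be read as a trade-off: group nodes so that per-round links are cheap while each group's pooled label histogram stays high-entropy, which is the split into the two sub-problems the abstract describes.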
Pages: 2600-2615
Page count: 16
Related Papers
50 items in total
  • [21] Li, Kaiju; Xiao, Chunhua. PBFL: Communication-Efficient Federated Learning via Parameter Predicting. COMPUTER JOURNAL, 2023, 66(03): 626-642.
  • [22] Li, Ang; Sun, Jingwei; Wang, Binghui; Duan, Lin; Li, Sicheng; Chen, Yiran; Li, Hai. LotteryFL: Empower Edge Intelligence with Personalized and Communication-Efficient Federated Learning. 2021 ACM/IEEE 6TH SYMPOSIUM ON EDGE COMPUTING (SEC 2021), 2021: 68-79.
  • [23] Lian, Zirui; Cao, Jing; Zuo, Yanru; Liu, Weihong; Zhu, Zongwei. AGQFL: Communication-efficient Federated Learning via Automatic Gradient Quantization in Edge Heterogeneous Systems. 2021 IEEE 39TH INTERNATIONAL CONFERENCE ON COMPUTER DESIGN (ICCD 2021), 2021: 551-558.
  • [24] Xiong, Yuanhao; Wang, Ruochen; Cheng, Minhao; Yu, Felix; Hsieh, Cho-Jui. FedDM: Iterative Distribution Matching for Communication-Efficient Federated Learning. 2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023: 16323-16332.
  • [25] Sun, Haifeng; Li, Shiqi; Yu, F. Richard; Qi, Qi; Wang, Jingyu; Liao, Jianxin. Toward Communication-Efficient Federated Learning in the Internet of Things With Edge Computing. IEEE INTERNET OF THINGS JOURNAL, 2020, 7(11): 11053-11067.
  • [26] Jia, Ninghui; Qu, Zhihao; Ye, Baoliu. Communication-efficient Federated Learning via Quantized Clipped SGD. WIRELESS ALGORITHMS, SYSTEMS, AND APPLICATIONS, WASA 2021, PT I, 2021, 12937: 559-571.
  • [27] Khan, Afsana; ten Thij, Marijn; Wilbik, Anna. Communication-Efficient Vertical Federated Learning. ALGORITHMS, 2022, 15(08).
  • [28] Yu, Feng; Lin, Hui; Wang, Xiaoding; Garg, Sahil; Kaddoum, Georges; Singh, Satinder; Hassan, Mohammad Mehedi. Communication-Efficient Personalized Federated Meta-Learning in Edge Networks. IEEE TRANSACTIONS ON NETWORK AND SERVICE MANAGEMENT, 2023, 20(02): 1558-1571.
  • [29] Zhu, Yonghui; Zhang, Ronghui; Cui, Yuanhao; Wu, Sheng; Jiang, Chunxiao; Jing, Xiaojun. Communication-Efficient Personalized Federated Edge Learning for Decentralized Sensing in ISAC. 2023 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS WORKSHOPS, ICC WORKSHOPS, 2023: 207-212.
  • [30] Oh, Yongjeong; Lee, Namyoon; Jeon, Yo-Seb; Poor, H. Vincent. Communication-Efficient Federated Learning via Quantized Compressed Sensing. IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2023, 22(02): 1087-1100.