HADFL: Heterogeneity-aware Decentralized Federated Learning Framework

Cited by: 13
Authors
Cao, Jing [1 ]
Lian, Zirui [1 ]
Liu, Weihong [1 ]
Zhu, Zongwei [1 ]
Ji, Cheng [2 ]
Affiliations
[1] Univ Sci & Technol China, Hefei, Anhui, Peoples R China
[2] Nanjing Univ Sci & Technol, Nanjing, Peoples R China
Funding
China Postdoctoral Science Foundation;
Keywords
Distributed Training; Machine Learning; Federated Learning; Heterogeneous Computing;
DOI
10.1109/DAC18074.2021.9586101
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Federated learning (FL) supports training models on geographically distributed devices. However, traditional FL systems adopt a centralized synchronous strategy, which places heavy communication pressure on the central server and poses challenges to model generalization. Existing FL optimizations either fail to speed up training on heterogeneous devices or suffer from poor communication efficiency. In this paper, we propose HADFL, a framework that supports decentralized asynchronous training on heterogeneous devices. Each device trains the model locally on its own data, using a heterogeneity-aware number of local steps. In each aggregation cycle, devices are selected probabilistically to perform model synchronization and aggregation. Compared with traditional FL systems, HADFL relieves the central server's communication pressure, efficiently utilizes heterogeneous computing power, and achieves maximum speedups of 3.15x over decentralized FedAvg and 4.68x over the PyTorch distributed training scheme, with almost no loss of convergence accuracy.
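The abstract names two mechanisms but not their exact rules, so the following is only a minimal Python sketch of what "heterogeneity-aware local steps" plus "probability-based selection for aggregation" could look like. The names (Device, local_steps_for, select_aggregators), the per-cycle time budget, and the power-weighted averaging are hypothetical illustrations, not taken from the paper.

import random

class Device:
    def __init__(self, name, step_time_s, power):
        self.name = name
        self.step_time_s = step_time_s   # measured time per local training step
        self.power = power               # relative computing capability (assumed metric)
        self.weights = [0.0]             # toy one-parameter "model"

    def local_train(self, steps):
        # stand-in for SGD: a real device would run optimizer steps over local data
        for _ in range(steps):
            self.weights[0] += random.uniform(-0.01, 0.01)

def local_steps_for(device, budget_s=10.0):
    # heterogeneity-aware local steps (hypothetical heuristic): fit as many
    # steps as a shared time budget allows, so fast and slow devices finish together
    return max(1, int(budget_s / device.step_time_s))

def select_aggregators(devices, k):
    # probability-based selection, weighted by computing power
    # (sampled with replacement for simplicity)
    return random.choices(devices, weights=[d.power for d in devices], k=k)

def aggregate(selected):
    # power-weighted model averaging among the selected devices
    total = sum(d.power for d in selected)
    avg = sum(d.weights[0] * d.power for d in selected) / total
    for d in selected:
        d.weights[0] = avg

devices = [Device("fast-gpu", 0.05, 4.0), Device("edge-cpu", 0.5, 1.0),
           Device("phone", 1.0, 0.5)]
for cycle in range(3):
    for d in devices:
        d.local_train(local_steps_for(d))
    aggregate(select_aggregators(devices, k=2))
    print(f"cycle {cycle}: " + ", ".join(f"{d.name}={d.weights[0]:+.4f}" for d in devices))

Under these assumptions, no device idles waiting for a straggler: each one converts its speed into more or fewer local steps per cycle, and aggregation involves only the sampled subset rather than every device through a central server.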
Pages: 1-6
Page count: 6
Related Papers
50 records in total
  • [31] BACombo-Bandwidth-Aware Decentralized Federated Learning
    Jiang, Jingyan
    Hu, Liang
    Hu, Chenghao
    Liu, Jiate
    Wang, Zhi
    ELECTRONICS, 2020, 9 (03)
  • [32] Graph Federated Learning Based on the Decentralized Framework
    Liu, Peilin
    Tang, Yanni
    Zhang, Mingyue
    Chen, Wu
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT III, 2023, 14256 : 452 - 463
  • [33] A decentralized data evaluation framework in federated learning
    Bhatia, Laveen
    Samet, Saeed
BLOCKCHAIN-RESEARCH AND APPLICATIONS, 2023, 4 (04)
  • [34] A Heterogeneity-Aware Task Scheduler for Spark
    Xu, Luna
    Butt, Ali R.
    Lim, Seung-Hwan
    Kannan, Ramakrishnan
    2018 IEEE INTERNATIONAL CONFERENCE ON CLUSTER COMPUTING (CLUSTER), 2018, : 245 - 256
  • [35] Petrel: Heterogeneity-Aware Distributed Deep Learning Via Hybrid Synchronization
    Zhou, Qihua
    Guo, Song
    Qu, Zhihao
    Li, Peng
    Li, Li
    Guo, Minyi
    Wang, Kun
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2021, 32 (05) : 1030 - 1043
  • [36] Heterogeneity-aware Distributed Parameter Servers
    Jiang, Jiawei
    Cui, Bin
    Zhang, Ce
    Yu, Lele
    SIGMOD'17: PROCEEDINGS OF THE 2017 ACM INTERNATIONAL CONFERENCE ON MANAGEMENT OF DATA, 2017, : 463 - 478
  • [37] HiFlash: Communication-Efficient Hierarchical Federated Learning With Adaptive Staleness Control and Heterogeneity-Aware Client-Edge Association
    Wu, Qiong
    Chen, Xu
    Ouyang, Tao
    Zhou, Zhi
    Zhang, Xiaoxi
    Yang, Shusen
    Zhang, Junshan
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2023, 34 (05) : 1560 - 1579
  • [38] Heterogeneity-Aware Distributed Machine Learning Training via Partial Reduce
    Miao, Xupeng
    Nie, Xiaonan
    Shao, Yingxia
    Yang, Zhi
    Jiang, Jiawei
    Ma, Lingxiao
    Cui, Bin
    SIGMOD '21: PROCEEDINGS OF THE 2021 INTERNATIONAL CONFERENCE ON MANAGEMENT OF DATA, 2021, : 2262 - 2270
  • [39] Heterogeneity-aware distributed access structure
    Beltrán, AG
    Milligan, P
    Sage, P
    FIFTH IEEE INTERNATIONAL CONFERENCE ON PEER-TO-PEER COMPUTING, PROCEEDINGS, 2005, : 152 - 153
  • [40] HALO: Heterogeneity-Aware Load Balancing
    Gandhi, Anshul
    Zhang, Xi
    Mittal, Naman
    2015 IEEE 23rd International Symposium on Modeling, Analysis, and Simulation of Computer and Telecommunication Systems (MASCOTS 2015), 2015, : 242 - 251