HADFL: Heterogeneity-aware Decentralized Federated Learning Framework

Cited by: 13
Authors
Cao, Jing [1 ]
Lian, Zirui [1 ]
Liu, Weihong [1 ]
Zhu, Zongwei [1 ]
Ji, Cheng [2 ]
Affiliations
[1] Univ Sci & Technol China, Hefei, Anhui, Peoples R China
[2] Nanjing Univ Sci & Technol, Nanjing, Peoples R China
Funding
China Postdoctoral Science Foundation;
Keywords
Distributed Training; Machine Learning; Federated Learning; Heterogeneous Computing;
DOI
10.1109/DAC18074.2021.9586101
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Federated learning (FL) supports training models on geographically distributed devices. However, traditional FL systems adopt a centralized synchronous strategy, which puts heavy communication pressure on the central server and challenges model generalization. Existing optimizations of FL either fail to speed up training on heterogeneous devices or suffer from poor communication efficiency. In this paper, we propose HADFL, a framework that supports decentralized asynchronous training on heterogeneous devices. Devices train the model on their local data with a heterogeneity-aware number of local steps. In each aggregation cycle, devices are selected probabilistically to perform model synchronization and aggregation. Compared with traditional FL systems, HADFL relieves the central server's communication pressure, efficiently utilizes heterogeneous computing power, and achieves maximum speedups of 3.15x over decentralized FedAvg and 4.68x over the PyTorch distributed training scheme, respectively, with almost no loss of convergence accuracy.
Pages: 1 - 6
Page count: 6
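
The abstract outlines two mechanisms: heterogeneity-aware local steps (faster devices run more local updates per cycle) and probability-based device selection for decentralized aggregation. The toy Python sketch below illustrates how such an aggregation cycle could be structured; it is not the paper's implementation, and the helper names (heterogeneity_aware_steps, aggregation_cycle) and the speed-proportional selection probabilities are assumptions made only for illustration.

import random

# Toy sketch of a HADFL-style aggregation cycle (illustrative only; the helper
# names and the speed-proportional selection rule are assumptions, not the
# paper's API). A "model" here is just a list of floats.

def heterogeneity_aware_steps(device_speed, base_steps=10):
    # Faster devices run proportionally more local steps per cycle.
    return max(1, round(base_steps * device_speed))

def local_train(model, steps, lr=0.01):
    # Placeholder for local SGD: perturbs parameters for `steps` iterations.
    for _ in range(steps):
        model = [w - lr * random.uniform(-1.0, 1.0) for w in model]
    return model

def weighted_average(models, weights):
    # Aggregate peer models with the given weights.
    total = sum(weights)
    return [sum(w * m[i] for m, w in zip(models, weights)) / total
            for i in range(len(models[0]))]

def aggregation_cycle(models, speeds):
    # 1. Each device trains locally with a heterogeneity-aware step count.
    models = [local_train(m, heterogeneity_aware_steps(s))
              for m, s in zip(models, speeds)]
    # 2. Devices are selected with probability proportional to compute speed
    #    (one plausible choice; the paper's probabilities may differ).
    probs = [s / sum(speeds) for s in speeds]
    selected = [i for i, p in enumerate(probs) if random.random() < p]
    # 3. Selected devices synchronize and aggregate among themselves, without
    #    routing every update through a central server.
    if len(selected) >= 2:
        aggregated = weighted_average([models[i] for i in selected],
                                      [probs[i] for i in selected])
        for i in selected:
            models[i] = aggregated
    return models

if __name__ == "__main__":
    random.seed(0)
    speeds = [1.0, 0.5, 2.0]               # relative compute power of 3 devices
    models = [[0.0, 0.0] for _ in speeds]  # identical initial models
    for _ in range(5):                     # run five aggregation cycles
        models = aggregation_cycle(models, speeds)
    print(models)

In this sketch a device's local step budget scales with its relative speed, so fast and slow devices finish a cycle in roughly the same wall-clock time, which is the intuition behind heterogeneity-aware local training.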