HADFL: Heterogeneity-aware Decentralized Federated Learning Framework

Cited by: 13
Authors
Cao, Jing [1 ]
Lian, Zirui [1 ]
Liu, Weihong [1 ]
Zhu, Zongwei [1 ]
Ji, Cheng [2 ]
Affiliations
[1] Univ Sci & Technol China, Hefei, Anhui, Peoples R China
[2] Nanjing Univ Sci & Technol, Nanjing, Peoples R China
Funding
China Postdoctoral Science Foundation;
Keywords
Distributed Training; Machine Learning; Federated Learning; Heterogeneous Computing;
DOI
10.1109/DAC18074.2021.9586101
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline classification code
0812;
Abstract
Federated learning (FL) supports training models on geographically distributed devices. However, traditional FL systems adopt a centralized synchronous strategy, which puts high communication pressure on the central server and poses model generalization challenges. Existing FL optimizations either fail to speed up training on heterogeneous devices or suffer from poor communication efficiency. In this paper, we propose HADFL, a framework that supports decentralized asynchronous training on heterogeneous devices. Devices train models locally on their own data, with heterogeneity-aware local steps adapted to each device's computing capability. In each aggregation cycle, devices are selected probabilistically to perform model synchronization and aggregation. Compared with a traditional FL system, HADFL relieves the central server's communication pressure, efficiently utilizes heterogeneous computing power, and achieves maximum speedups of 3.15x over decentralized-FedAvg and 4.68x over the PyTorch distributed training scheme, respectively, with almost no loss of convergence accuracy.
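The abstract describes the training scheme only at a high level. Below is a minimal, self-contained NumPy sketch of that idea, not the authors' implementation: the names Device, local_steps_for, and aggregation_cycle, the toy linear model, and the speed-proportional selection probabilities are all illustrative assumptions; HADFL's actual scheduling, communication topology, and aggregation weights are specified in the paper.

```python
# Minimal sketch (not the authors' code) of a heterogeneity-aware, decentralized
# aggregation cycle: heterogeneity-aware local steps + probability-based selection.
import numpy as np

rng = np.random.default_rng(0)

class Device:
    def __init__(self, speed, data):
        self.speed = speed          # relative compute power (heterogeneity proxy)
        self.data = data            # local (x, y) training samples
        self.w = np.zeros(2)        # local model parameters (toy linear model)

    def local_train(self, steps, lr=0.05):
        # Plain SGD on a linear regression model.
        for _ in range(steps):
            x, y = self.data[rng.integers(len(self.data))]
            grad = 2 * (self.w @ x - y) * x
            self.w -= lr * grad

def local_steps_for(device, base_steps=20):
    # Heterogeneity-aware step budget: faster devices run more local steps,
    # so devices finish a cycle in roughly the same wall-clock time.
    return max(1, int(base_steps * device.speed))

def aggregation_cycle(devices, k=2):
    # Each device trains locally, then a subset is chosen with probability
    # proportional to compute power to synchronize and average their models.
    for d in devices:
        d.local_train(local_steps_for(d))
    p = np.array([d.speed for d in devices], dtype=float)
    p /= p.sum()
    chosen = rng.choice(len(devices), size=k, replace=False, p=p)
    avg = np.mean([devices[i].w for i in chosen], axis=0)
    for i in chosen:
        devices[i].w = avg.copy()

def make_data(n):
    # Toy data: y = 3*x0 - 1*x1 plus noise.
    xs = rng.normal(size=(n, 2))
    ys = xs @ np.array([3.0, -1.0]) + 0.1 * rng.normal(size=n)
    return list(zip(xs, ys))

devices = [Device(speed=s, data=make_data(50)) for s in (0.5, 1.0, 2.0)]
for cycle in range(50):
    aggregation_cycle(devices)
print("learned weights per device:", [np.round(d.w, 2) for d in devices])
```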
Pages: 1-6
Page count: 6