FedCS: Efficient communication scheduling in decentralized federated learning

Cited by: 4
Authors
Zong, Ruixing [1 ]
Qin, Yunchuan [1 ]
Wu, Fan [1 ]
Tang, Zhuo [1 ,2 ]
Li, Kenli [1 ,2 ]
Affiliations
[1] Hunan Univ, Changsha, Peoples R China
[2] Natl Supercomp Changsha Ctr, Changsha, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Deep learning; Ring all-reduce; Decentralized federated learning; Device placement; ATTACKS;
DOI
10.1016/j.inffus.2023.102028
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Decentralized federated learning is a training approach that prioritizes the protection of user data privacy while also offering improved scalability and robustness. However, as the number of edge devices participating in training grows, communication among devices in different geographical locations incurs significant overhead. A well-designed gradient synchronization strategy is therefore crucial for minimizing the overall communication cost of training. To tackle this issue, this article introduces a parameter synchronization strategy based on a 2D-Ring network structure together with a 2D-attention-based device placement algorithm, both aimed at minimizing communication overhead. The synchronization strategy organizes the participating devices into a two-layer ring communication architecture, reducing the overall frequency of parameter synchronization in decentralized federated learning. An optimization problem is formulated that couples the total communication overhead with the device placement strategy: a 2D-attention neural network is constructed to search for device placements on the 2D-Ring structure that reduce communication overhead. Moreover, an evaluation model is designed to assess the communication overhead of a complex decentralized system during federated training, enabling precise determination of the total communication overhead throughout training and informing the device placement strategy. Extensive simulations confirm that the proposed approach reduces the total communication overhead of decentralized federated learning by 55% and 64% when training with 50 and 100 devices, respectively.
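The abstract describes a two-layer ring synchronization but gives no implementation details. Below is a minimal sketch of the general idea, assuming devices are partitioned into equal-sized groups and modeling each ring all-reduce by its end result rather than its pipelined reduce-scatter/all-gather phases. The names ring_all_reduce, two_layer_ring_sync, group_size, and ring_traffic_bytes are illustrative, not from the paper.

```python
# Minimal sketch of a two-layer ring gradient synchronization, in the
# spirit of the 2D-Ring structure described in the abstract. All names,
# the grouping scheme, and the cost formula below are assumptions for
# illustration, not the authors' implementation.
import numpy as np

def ring_all_reduce(vecs):
    """Return the element-wise sum of `vecs` at every node, as a ring
    all-reduce would; the chunked phases of a real ring are elided."""
    total = np.sum(vecs, axis=0)
    return [total.copy() for _ in vecs]

def two_layer_ring_sync(grads, group_size):
    """Average `grads` across all devices using an intra-group ring
    followed by an inter-group ring over one leader per group."""
    n = len(grads)
    groups = [grads[i:i + group_size] for i in range(0, n, group_size)]
    # Layer 1: each group's ring leaves every member holding the group sum.
    group_sums = [ring_all_reduce(g)[0] for g in groups]
    # Layer 2: group leaders run a second ring over the group sums,
    # producing the global sum, which is broadcast back as the average.
    global_avg = ring_all_reduce(group_sums)[0] / n
    return [global_avg.copy() for _ in grads]

def ring_traffic_bytes(k, msg_bytes):
    """Standard per-node traffic of one ring all-reduce over k nodes:
    each node sends and receives 2 * (k - 1) / k times the message size."""
    return 2 * (k - 1) / k * msg_bytes

rng = np.random.default_rng(0)
grads = [rng.standard_normal(4) for _ in range(6)]
synced = two_layer_ring_sync(grads, group_size=3)
assert np.allclose(synced[0], np.mean(grads, axis=0))
```

The two-layer arrangement matters because the inner rings can be placed over cheap local links while only the small leader ring crosses expensive wide-area links; which devices share a group is exactly the placement decision the paper's 2D-attention network is said to optimize.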
Pages: 9