Semi-Decentralized Federated Learning With Cooperative D2D Local Model Aggregations

Cited by: 51
Authors
Lin, Frank Po-Chen [1 ]
Hosseinalipour, Seyyedali [1 ]
Azam, Sheikh Shams [1 ]
Brinton, Christopher G. [1 ]
Michelusi, Nicolo [2 ]
Affiliations
[1] Purdue Univ, Sch Elect & Comp Engn, W Lafayette, IN 47907 USA
[2] Arizona State Univ, Sch Elect Comp & Energy Engn, Tempe, AZ 85281 USA
Funding
US National Science Foundation (NSF)
Keywords
Device-to-device (D2D) communications; peer-to-peer (P2P) learning; fog learning; cooperative consensus formation; semi-decentralized federated learning; CONVERGENCE; DEVICE; COMMUNICATION; OPTIMIZATION; ALLOCATION; DESIGN; AWARE; POWER;
DOI
10.1109/JSAC.2021.3118344
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic and Communication Technology]
Discipline codes
0808; 0809
Abstract
Federated learning has emerged as a popular technique for distributing machine learning (ML) model training across the wireless edge. In this paper, we propose two-timescale hybrid federated learning (TT-HF), a semi-decentralized learning architecture that combines the conventional device-to-server communication paradigm for federated learning with device-to-device (D2D) communications for model training. In TT-HF, during each global aggregation interval, devices (i) perform multiple stochastic gradient descent iterations on their individual datasets, and (ii) aperiodically engage in a consensus procedure on their model parameters through cooperative, distributed D2D communications within local clusters. With a new general definition of gradient diversity, we formally study the convergence behavior of TT-HF, resulting in new convergence bounds for distributed ML. We leverage our convergence bounds to develop an adaptive control algorithm that tunes the step size, D2D communication rounds, and global aggregation period of TT-HF over time to target a sublinear convergence rate of O(1/t) while minimizing network resource utilization. Our subsequent experiments demonstrate that TT-HF significantly outperforms the current state of the art in federated learning in terms of model accuracy and/or network energy consumption in different scenarios where local device datasets exhibit statistical heterogeneity. Finally, our numerical evaluations demonstrate robustness against outages caused by fading channels, as well as favorable performance with non-convex loss functions.
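The training loop the abstract describes — local SGD steps, aperiodic intra-cluster D2D consensus, and periodic global aggregation — can be sketched in a few lines. This is a minimal illustrative simulation on toy quadratic losses; the function names (`run_tt_hf`, `consensus`), the cluster layout, the uniform intra-cluster mixing, and all hyperparameter values are assumptions for illustration, not the paper's actual algorithmic details or adaptive control scheme.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 5                                  # model dimension
n_dev = 5                              # number of edge devices
clusters = [[0, 1, 2], [3, 4]]         # hypothetical D2D cluster layout
targets = rng.normal(size=(n_dev, d))  # device i minimizes ||w - targets[i]||^2 / 2

def local_sgd_step(w, i, eta=0.1):
    # one noisy gradient step on device i's local quadratic loss
    grad = (w - targets[i]) + 0.01 * rng.normal(size=d)
    return w - eta * grad

def consensus(ws, cluster, rounds=2):
    # cooperative D2D averaging: each round, every device in the cluster
    # mixes its parameters toward the cluster mean (doubly stochastic mixing)
    for _ in range(rounds):
        avg = np.mean([ws[i] for i in cluster], axis=0)
        for i in cluster:
            ws[i] = 0.5 * ws[i] + 0.5 * avg
    return ws

def run_tt_hf(T=60, tau=10, consensus_every=3):
    ws = [np.zeros(d) for _ in range(n_dev)]
    for t in range(1, T + 1):
        ws = [local_sgd_step(w, i) for i, w in enumerate(ws)]
        if t % consensus_every == 0:      # (ii) aperiodic D2D consensus
            for c in clusters:
                ws = consensus(ws, c)
        if t % tau == 0:                  # device-to-server global aggregation
            w_glob = np.mean(ws, axis=0)
            ws = [w_glob.copy() for _ in range(n_dev)]
    return np.mean(ws, axis=0)

w_final = run_tt_hf()
# the global optimum of the summed quadratics is the mean of the targets
print(np.linalg.norm(w_final - targets.mean(axis=0)))
```

The two timescales are visible in the loop: consensus rounds run on a faster clock (`consensus_every`) than global aggregations (`tau`), which is what lets D2D exchanges curb local model drift between uplink transmissions.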
Pages: 3851-3869
Page count: 19