Wireless Distributed Learning: A New Hybrid Split and Federated Learning Approach

Cited by: 31
Authors:
Liu, Xiaolan [1 ]
Deng, Yansha [2 ]
Mahmoodi, Toktam [2 ]
Affiliations:
[1] Loughborough Univ, Inst Digital Technol, London E20 3BS, England
[2] Kings Coll London, Dept Engn, London WC2R 2LS, England
Funding:
Engineering and Physical Sciences Research Council (EPSRC), UK
Keywords:
Computational modeling; Wireless communication; Training; Autonomous aerial vehicles; Distance learning; Computer aided instruction; Data models; Wireless unmanned aerial vehicle (UAV) networks; Federated learning (FL); Multi-Arm Bandit (MAB); Split learning (SL); User (UE) selection; UAVs
DOI:
10.1109/TWC.2022.3213411
CLC Classification:
TM [Electrical Technology]; TN [Electronic and Communication Technology]
Subject Classification Codes:
0808; 0809
Abstract:
Cellular-connected unmanned aerial vehicles (UAVs) with flexible deployment are foreseen to be a major part of sixth generation (6G) networks. UAVs connected to the base station (BS) as aerial users (UEs) can exploit machine learning (ML) algorithms to provide a wide range of advanced applications, such as object detection and video tracking. Conventionally, ML model training is performed at the BS, known as centralized learning (CL), which incurs high communication overhead due to the transmission of large datasets and raises potential concerns about UE privacy. To address this, distributed learning algorithms, including federated learning (FL) and split learning (SL), have been proposed to train ML models in a distributed manner by sharing only model parameters. FL requires more computational resources on the UE side than SL, while SL incurs larger communication overhead when the local dataset is large. To effectively train an ML model over UEs with diverse computational capabilities and channel conditions, we first propose a novel distributed learning architecture, the hybrid split and federated learning (HSFL) algorithm, which combines the parallel model training mechanism of FL with the model splitting structure of SL. We then provide its convergence analysis under non-independent and identically distributed (non-IID) data with a random UE selection scheme. By conducting experiments on training two ML models, Net and AlexNet, in wireless UAV networks, we demonstrate that the HSFL algorithm achieves higher learning accuracy than FL and lower communication overhead than SL under both IID and non-IID data, and that the learning accuracy of HSFL increases with the number of split-training UEs. We further propose a Multi-Arm Bandit (MAB) based best channel (BC) and best 2-norm (BN2) (MAB-BC-BN2) UE selection scheme that selects, in each round, the UEs with better wireless channel quality and larger local model updates. Numerical results demonstrate that it achieves higher learning accuracy than the BC, MAB-BC, and MAB-BN2 UE selection schemes under non-IID, Dirichlet non-IID, and Dirichlet-imbalanced data.
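For intuition, the sketch below illustrates one HSFL-style training round under toy assumptions: a two-part model with a fixed cut layer, synthetic local data, four UEs (two running split training and two running FL-style full local training), and plain equal-weight averaging at the server. The model classes, helper names, and hyperparameters are illustrative choices of this sketch, not the paper's implementation, which trains Net and AlexNet over a modeled wireless UAV channel with MAB-based UE selection.

```python
# Minimal HSFL round sketch (illustrative only; not the authors' code).
import copy
import torch
import torch.nn as nn

class ClientPart(nn.Module):          # layers kept on the UE (up to the cut layer)
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(16, 32), nn.ReLU())
    def forward(self, x):
        return self.net(x)

class ServerPart(nn.Module):          # layers hosted at the BS/server
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(32, 10))
    def forward(self, x):
        return self.net(x)

def fl_update(client_part, server_part, x, y, lr=0.01):
    """FL-type UE: trains the full model locally and returns updated weights."""
    c, s = copy.deepcopy(client_part), copy.deepcopy(server_part)
    opt = torch.optim.SGD(list(c.parameters()) + list(s.parameters()), lr=lr)
    loss = nn.functional.cross_entropy(s(c(x)), y)
    opt.zero_grad(); loss.backward(); opt.step()
    return c.state_dict(), s.state_dict()

def sl_update(client_part, server_part, x, y, lr=0.01):
    """SL-type UE: computes activations up to the cut layer; the server
    finishes the forward pass, backpropagates, and returns the gradient of
    the smashed data so the UE can update its client-side layers."""
    c, s = copy.deepcopy(client_part), copy.deepcopy(server_part)
    opt_c = torch.optim.SGD(c.parameters(), lr=lr)
    opt_s = torch.optim.SGD(s.parameters(), lr=lr)
    smashed = c(x)                                          # sent UE -> server
    smashed_srv = smashed.detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(s(smashed_srv), y)
    opt_s.zero_grad(); loss.backward(); opt_s.step()
    opt_c.zero_grad(); smashed.backward(smashed_srv.grad)   # gradient sent server -> UE
    opt_c.step()
    return c.state_dict(), s.state_dict()

def average(dicts):
    """FedAvg-style aggregation of state dicts (equal weights for brevity)."""
    out = copy.deepcopy(dicts[0])
    for k in out:
        out[k] = torch.stack([d[k] for d in dicts]).mean(dim=0)
    return out

# One HSFL round over 4 UEs: UEs 0-1 run split training, UEs 2-3 run FL.
client_g, server_g = ClientPart(), ServerPart()
c_updates, s_updates = [], []
for ue in range(4):
    x, y = torch.randn(8, 16), torch.randint(0, 10, (8,))   # synthetic local data
    step = sl_update if ue < 2 else fl_update
    c_sd, s_sd = step(client_g, server_g, x, y)
    c_updates.append(c_sd); s_updates.append(s_sd)
client_g.load_state_dict(average(c_updates))   # aggregate into one global model
server_g.load_state_dict(average(s_updates))
```

In the paper's setting, the choice of which UEs train in split mode versus FL mode, and which UEs are selected at all in a given round, depends on their computational capabilities, channel quality, and (under MAB-BC-BN2) the 2-norm of their local model updates; the equal split and random data used here are placeholders.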
Pages: 2650-2665
Page count: 16
Related Papers
50 items in total
  • [1] Split Consensus Federated Learning: An Approach for Distributed Training and Inference
    Tedeschini, Bernardo Camajori
    Brambilla, Mattia
    Nicoli, Monica
    IEEE ACCESS, 2024, 12 : 119535 - 119549
  • [2] A Novel Hybrid Split and Federated Learning Architecture in Wireless UAV Networks
    Liu, Xiaolan
    Deng, Yansha
    Mahmoodi, Toktam
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022,
  • [3] Federated or Split? A Performance and Privacy Analysis of Hybrid Split and Federated Learning Architectures
    Turina, Valeria
    Zhang, Zongshun
    Esposito, Flavio
    Matta, Ibrahim
    2021 IEEE 14TH INTERNATIONAL CONFERENCE ON CLOUD COMPUTING (CLOUD 2021), 2021, : 250 - 260
  • [4] Energy Efficient User Scheduling for Hybrid Split and Federated Learning in Wireless UAV Networks
    Liu, Xiaolan
    Deng, Yansha
    Mahmoodi, Toktam
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022,
  • [5] Effectively Heterogeneous Federated Learning: A Pairing and Split Learning Based Approach
    Shen, Jinglong
    Wang, Xiucheng
    Cheng, Nan
    Ma, Longfei
    Zhou, Conghao
    Zhang, Yuan
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023, : 5847 - 5852
  • [6] CHEESE: Distributed Clustering-Based Hybrid Federated Split Learning Over Edge Networks
    Cheng, Zhipeng
    Xia, Xiaoyu
    Liwang, Minghui
    Fan, Xuwei
    Sun, Yanglong
    Wang, Xianbin
    Huang, Lianfen
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2023, 34 (12) : 3174 - 3191
  • [7] Resource Optimized Hierarchical Split Federated Learning for Wireless Networks
    Khan, Latif U.
    Guizani, Mohsen
    Hong, Choong Seon
    2023 CYBER-PHYSICAL SYSTEMS AND INTERNET-OF-THINGS WEEK, CPS-IOT WEEK WORKSHOPS, 2023, : 254 - 259
  • [8] Accelerating Split Federated Learning Over Wireless Communication Networks
    Xu, Ce
    Li, Jinxuan
    Liu, Yuan
    Ling, Yushi
    Wen, Miaowen
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2024, 23 (06) : 5587 - 5599
  • [9] Federated Learning Approach for Distributed Ransomware Analysis
    Vehabovic, Aldin
    Zanddizari, Hadi
    Shaikh, Farook
    Ghani, Nasir
    Pour, Morteza Safaei
    Bou-Harb, Elias
    Crichigno, Jorge
    APPLIED CRYPTOGRAPHY AND NETWORK SECURITY WORKSHOPS, ACNS 2023 SATELLITE WORKSHOPS, ADSC 2023, AIBLOCK 2023, AIHWS 2023, AIOTS 2023, CIMSS 2023, CLOUD S&P 2023, SCI 2023, SECMT 2023, SIMLA 2023, 2023, 13907 : 621 - 641
  • [10] A Distributed Aggregation Approach for Vehicular Federated Learning
    Pacheco, Lucas
    Braun, Torsten
    Rosario, Denis
    Di Maio, Antonio
    Cerqueira, Eduardo
    IEEE ACCESS, 2024, 12 : 72155 - 72169