Efficient Parallel Split Learning Over Resource-Constrained Wireless Edge Networks

Cited by: 19
Authors
Lin, Zheng [1 ]
Zhu, Guangyu [2 ]
Deng, Yiqin [3 ]
Chen, Xianhao [1 ]
Gao, Yue [4 ]
Huang, Kaibin [1 ]
Fang, Yuguang [5 ]
Affiliations
[1] Univ Hong Kong, Dept Elect & Elect Engn, Hong Kong, Peoples R China
[2] Univ Florida, Dept Elect & Comp Engn, Gainesville, FL 32611 USA
[3] Shandong Univ, Sch Control Sci & Engn, Jinan 250061, Peoples R China
[4] Fudan Univ, Sch Comp Sci, Shanghai 200438, Peoples R China
[5] City Univ Hong Kong, Dept Comp Sci, Hong Kong, Peoples R China
Keywords
Computational modeling; Training; Resource management; Servers; Data models; Internet of Things; Optimization; Distributed learning; edge computing; edge intelligence; resource management; split learning; CONVERGENCE;
DOI
10.1109/TMC.2024.3359040
CLC classification number
TP [automation technology, computer technology];
Discipline classification number
0812;
Abstract
Increasingly deep neural networks hinder the democratization of privacy-enhancing distributed learning, such as federated learning (FL), to resource-constrained devices. To overcome this challenge, in this paper, we advocate integrating the edge computing paradigm with parallel split learning (PSL), allowing multiple edge devices to offload substantial training workloads to an edge server via layer-wise model splitting. Observing that existing PSL schemes incur excessive training latency and a large volume of data transmissions, we propose an innovative PSL framework, namely efficient parallel split learning (EPSL), to accelerate model training. Specifically, EPSL parallelizes client-side model training and reduces the dimension of activations' gradients for backpropagation (BP) via last-layer gradient aggregation, leading to a significant reduction in server-side training and communication latency. Moreover, by considering the heterogeneous channel conditions and computing capabilities of edge devices, we jointly optimize subchannel allocation, power control, and cut layer selection to minimize the per-round latency. Simulation results show that the proposed EPSL framework significantly decreases the training latency needed to achieve a target accuracy compared with state-of-the-art benchmarks, and that the tailored resource management and layer split strategy considerably reduces latency compared with the counterpart without optimization.
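The core latency-saving idea described in the abstract, aggregating the cut-layer (smashed-data) gradients across clients so that a single reduced-dimension gradient tensor is backpropagated and broadcast, can be illustrated with a short sketch. The PyTorch snippet below is a minimal, hypothetical illustration of such gradient aggregation in parallel split learning, not the authors' EPSL implementation; the model shapes, the simple averaging rule, and all names (server_model, client_acts, and so on) are assumptions made purely for illustration.

```python
# Hypothetical sketch of cut-layer gradient aggregation in parallel split
# learning. NOT the EPSL implementation from the paper: shapes, the averaging
# rule, and all variable names are illustrative assumptions.
import torch
import torch.nn as nn

NUM_CLIENTS, BATCH, CUT_DIM, NUM_CLASSES = 4, 8, 32, 10

# Server-side sub-model: the layers placed after the cut layer.
server_model = nn.Sequential(
    nn.Linear(CUT_DIM, 64), nn.ReLU(), nn.Linear(64, NUM_CLASSES)
)
criterion = nn.CrossEntropyLoss()

# Smashed data (cut-layer activations) uploaded by each edge device.
client_acts = [torch.randn(BATCH, CUT_DIM, requires_grad=True)
               for _ in range(NUM_CLIENTS)]
labels = [torch.randint(0, NUM_CLASSES, (BATCH,)) for _ in range(NUM_CLIENTS)]

# Forward all clients' activations through the server-side model in one batch,
# i.e., the server processes the client batches in parallel.
acts = torch.cat(client_acts, dim=0)              # (NUM_CLIENTS*BATCH, CUT_DIM)
loss = criterion(server_model(acts), torch.cat(labels, dim=0))
loss.backward()                                   # single server-side BP pass

# Aggregate (average) the cut-layer gradients across clients so that one
# gradient tensor of shape (BATCH, CUT_DIM) is broadcast on the downlink,
# instead of NUM_CLIENTS separate per-client gradients.
stacked = torch.stack([a.grad for a in client_acts])  # (NUM_CLIENTS, BATCH, CUT_DIM)
aggregated_grad = stacked.mean(dim=0)                 # (BATCH, CUT_DIM)
print("downlink gradient shape:", tuple(aggregated_grad.shape))
```

In this toy setting the downlink payload shrinks from NUM_CLIENTS gradient tensors to a single (BATCH, CUT_DIM) tensor, mirroring the communication saving the abstract attributes to last-layer gradient aggregation.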
Pages: 9224 - 9239
Page count: 16
Related Papers
50 records in total
  • [41] AdaptiveMesh: Adaptive Federated Learning for Resource-Constrained Wireless Environments
    Shkurti, Lamir
    Selimi, Mennan
    INTERNATIONAL JOURNAL OF ONLINE AND BIOMEDICAL ENGINEERING, 2024, 20 (14) : 22 - 37
  • [42] Resource Management and Fairness for Federated Learning over Wireless Edge Networks
    Balakrishnan, Ravikumar
    Akdeniz, Mustafa
    Dhakal, Sagar
    Himayat, Nageen
    PROCEEDINGS OF THE 21ST IEEE INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS (IEEE SPAWC2020), 2020,
  • [43] Efficient Resource-Constrained Monitoring
    Moraney, Jalil
    Raz, Danny
    PROCEEDINGS OF THE IEEE/IFIP NETWORK OPERATIONS AND MANAGEMENT SYMPOSIUM 2022, 2022,
  • [44] Exploration of Balanced Design in Resource-Constrained Edge Device for Efficient CNNs
    Wang, Xiaotian
    Tian, Teng
    Zhao, Letian
    Wu, Wei
    Jin, Xi
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2022, 69 (11) : 4573 - 4577
  • [45] Performance Analysis for Resource Constrained Decentralized Federated Learning Over Wireless Networks
    Yan, Zhigang
    Li, Dong
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2024, 72 (07) : 4084 - 4100
  • [46] Efficient Adaptive Federated Learning in Resource-Constrained IoT Environments
    Chen, Zunming
    Cui, Hongyan
    Luan, Qiuji
    Xi, Yu
    IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023: 1896 - 1901
  • [47] Complex Event Detection in Extremely Resource-Constrained Wireless Sensor Networks
    Zoumboulakis, Michael
    Roussos, George
    MOBILE NETWORKS & APPLICATIONS, 2011, 16 (02): 194 - 213
  • [49] Resource-Constrained Federated Edge Learning With Heterogeneous Data: Formulation and Analysis
    Liu, Yi
    Zhu, Yuanshao
    Yu, James J. Q.
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2022, 9 (05): 3166 - 3178
  • [50] Fully Distributed Deep Learning Inference on Resource-Constrained Edge Devices
    Stahl, Rafael
    Zhao, Zhuoran
    Mueller-Gritschneder, Daniel
    Gerstlauer, Andreas
    Schlichtmann, Ulf
    EMBEDDED COMPUTER SYSTEMS: ARCHITECTURES, MODELING, AND SIMULATION, SAMOS 2019, 2019, 11733 : 77 - 90