CRAS-FL: Clustered resource-aware scheme for federated learning in vehicular networks

Cited: 0
|
Authors
Abdulrahman, Sawsan [1 ,2 ]
Bouachir, Ouns [1 ]
Otoum, Safa [1 ]
Mourad, Azzam [2 ,3 ]
Affiliations
[1] Zayed Univ, Coll Technol Innovat, Dubai, U Arab Emirates
[2] Lebanese Amer Univ, Artificial Intelligence & Cyber Syst Res Ctr, Dept CSM, Beirut 11022801, Lebanon
[3] Khalifa Univ, KU Res Ctr 6G, Dept CS, Abu Dhabi, U Arab Emirates
Keywords
Federated learning; Clustering; Vehicular-to-vehicular communication; Resource optimization; Computational offloading; Vehicular networks; SELECTION; MODEL;
DOI
10.1016/j.vehcom.2024.100769
CLC classification number
TN [Electronic technology; communication technology];
Discipline classification code
0809 ;
Abstract
As a promising distributed learning paradigm, Federated Learning (FL) is expected to meet the ever-increasing needs of Machine Learning (ML) based applications in Intelligent Transportation Systems (ITS). It is a powerful tool that processes the large amounts of on-board data while preserving privacy by learning the models locally. However, training and transmitting the model parameters in vehicular networks consume significant resources and time, which is unsuitable for applications with strict real-time requirements. Moreover, the quality of the data, the mobility of the participating vehicles, and their heterogeneous capabilities can all affect the performance of the FL process, bringing the optimization of data selection and client resources to the forefront. In this paper, we propose CRAS-FL, a Clustered Resource-Aware Scheme for FL in Vehicular Networks. The proposed approach bypasses (1) communication bottlenecks by forming groups of vehicles, where the Cluster Head (CH) is responsible for handling the communication, and (2) computation bottlenecks by introducing an offloading strategy that leverages the spare resources available on some vehicles. In particular, CRAS-FL implements a CH election algorithm that considers bandwidth, stability, computational resources, and vehicle topology in order to ensure reliable communication and cluster stability. Moreover, the offloading strategy examines the quality of the models and the resources of the clients, and accordingly allows computational offloading among the group peers. The conducted experiments show that the proposed scheme outperforms current approaches in the literature by (1) reducing the communication overhead, (2) targeting more training data, and (3) reducing the clusters' response time.
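The CH election described in the abstract can be illustrated with a minimal sketch. Note that the abstract only names the four factors considered (bandwidth, stability, computational resources, and vehicle topology); the metric names, normalization, and weights below are hypothetical and do not reflect the paper's actual scoring function.

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    vid: str
    bandwidth: float   # normalized link bandwidth, 0..1
    stability: float   # normalized mobility/link stability, 0..1
    compute: float     # normalized spare computational capacity, 0..1
    centrality: float  # normalized position in the cluster topology, 0..1

# Hypothetical weights; the paper's actual weighting scheme is not given in the abstract.
WEIGHTS = {"bandwidth": 0.3, "stability": 0.3, "compute": 0.2, "centrality": 0.2}

def ch_score(v: Vehicle) -> float:
    """Weighted sum over the four factors the CH election is said to consider."""
    return (WEIGHTS["bandwidth"] * v.bandwidth
            + WEIGHTS["stability"] * v.stability
            + WEIGHTS["compute"] * v.compute
            + WEIGHTS["centrality"] * v.centrality)

def elect_cluster_head(cluster: list[Vehicle]) -> Vehicle:
    """Elect the highest-scoring vehicle in the cluster as CH."""
    return max(cluster, key=ch_score)
```

In such a scheme, a vehicle with both a stable, high-bandwidth link and a central position in the cluster would be favored as CH, since it relays all model traffic between its cluster members and the server.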
Pages: 11
Related papers
50 records total
  • [1] Driving Towards Efficiency: Adaptive Resource-aware Clustered Federated Learning in Vehicular Networks
    Khalil, Ahmad
    Delouee, Majid Lotfian
    Degeler, Victoria
    Meuser, Tobias
    Anta, Antonio Fernandez
    Koldehofe, Boris
    2024 22ND MEDITERRANEAN COMMUNICATION AND COMPUTER NETWORKING CONFERENCE, MEDCOMNET 2024, 2024
  • [2] Resource-Aware Hierarchical Federated Learning in Wireless Video Caching Networks
    Pervej, Md Ferdous
    Molisch, Andreas F.
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2025, 24 (01) : 165 - 180
  • [3] Resource-Aware Knowledge Distillation for Federated Learning
    Chen, Zheyi
    Tian, Pu
    Liao, Weixian
    Chen, Xuhui
    Xu, Guobin
    Yu, Wei
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTING, 2023, 11 (03) : 706 - 719
  • [4] FedTAR: Task and Resource-Aware Federated Learning for Wireless Computing Power Networks
    Sun, Wen
    Li, Zongjun
    Wang, Qubeijian
    Zhang, Yan
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (05) : 4257 - 4270
  • [5] RHFedMTL: Resource-Aware Hierarchical Federated Multitask Learning
    Yi, Xingfu
    Li, Rongpeng
    Peng, Chenghui
    Wang, Fei
    Wu, Jianjun
    Zhao, Zhifeng
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (14) : 25227 - 25238
  • [6] Resource-Aware Split Federated Learning for Edge Intelligence
    Arouj, Amna
    Abdelmoniem, Ahmed M.
    Alhilal, Ahmad
    You, Linlin
    Wang, Chen
    PROCEEDINGS 2024 IEEE 3RD WORKSHOP ON MACHINE LEARNING ON EDGE IN SENSOR SYSTEMS, SENSYS-ML 2024, 2024, : 15 - 20
  • [7] Resource-Aware Personalized Federated Learning Based on Reinforcement Learning
    Wu, Tingting
    Li, Xiao
    Gao, Pengpei
    Yu, Wei
    Xin, Lun
    Guo, Manxue
    IEEE COMMUNICATIONS LETTERS, 2025, 29 (01) : 175 - 179
  • [8] Resource-Aware Heterogeneous Federated Learning with Specialized Local Models
    Yu, Sixing
    Munoz, J. Pablo
    Jannesari, Ali
    EURO-PAR 2024: PARALLEL PROCESSING, PT I, EURO-PAR 2024, 2024, 14801 : 389 - 403
  • [9] Resource-Aware Asynchronous Online Federated Learning for Nonlinear Regression
    Gauthier, Francois
    Gogineni, Vinay Chakravarthi
    Werner, Stefan
    Huang, Yih-Fang
    Kuh, Anthony
    IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 2828 - 2833
  • [10] FedTeams: Towards Trust-Based and Resource-Aware Federated Learning
    Popovic, Dorde
    Gedawy, Hend K.
    Harras, Khaled A.
    2022 IEEE INTERNATIONAL CONFERENCE ON CLOUD COMPUTING TECHNOLOGY AND SCIENCE (CLOUDCOM 2022), 2022, : 121 - 128