Communication-Efficient Federated Learning in Drone-Assisted IoT Networks: Path Planning and Enhanced Knowledge Distillation Techniques

Cited by: 4
Authors
Gad, Gad [1 ]
Farrag, Aya [1 ]
Fadlullah, Zubair Md [2 ]
Fouda, Mostafa M. [3 ,4 ]
Affiliations
[1] Lakehead Univ, Dept Comp Sci, Thunder Bay, ON, Canada
[2] Western Univ, Dept Comp Sci, London, ON, Canada
[3] Idaho State Univ, Dept Elect & Comp Engn, Pocatello, ID USA
[4] Ctr Adv Energy Studies CAES, Idaho Falls, ID USA
Keywords
Deep learning; UAV networks; drone-aided LoRa networks; edge devices; Federated Learning (FL); digital health and well-being; Knowledge Distillation; Self-Organizing Maps;
DOI
10.1109/PIMRC56721.2023.10294036
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Code
0808; 0809;
Abstract
As 5G and beyond networks continue to proliferate, intelligent monitoring systems are becoming increasingly prevalent. However, geographically isolated regions with sparse populations still face difficulties in accessing these technologies due to infrastructure deployment challenges, and the high cost and unreliability of satellite Internet services make them less appealing. This paper studies the challenges of drone-aided networks and presents a communication-efficient Federated Learning (FL) system on a drone-aided Internet of Things (IoT) network to facilitate health analysis in rural areas over LoRa wireless links. The proposed approach consists of two primary components. First, the drone's trajectory optimization is formulated as a modified Traveling Salesman Problem (TSP), and the Self-Organizing Map (SOM) algorithm is employed for route planning. Second, a Knowledge Distillation (KD)-based FL algorithm reduces communication overhead by exchanging soft labels rather than model weights. The quality of the drone routes generated by the SOM is evaluated on multi-scale maps with pre-determined optimal paths. The experiments show that the SOM accurately represents node topologies and yields cost-effective Hamiltonian cycles. The KD-based FL algorithm is more communication-efficient than FedAvg because it exchanges soft labels instead of model weights, reducing drone waiting time and battery consumption. We showcase the performance of our KD-based FL algorithm on Human Activity Recognition (HAR) datasets, illustrating a communication-efficient alternative to conventional distributed learning that offers competitive performance by leveraging a shared dataset for knowledge transfer among IoT devices.
Pages: 7
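The abstract describes two algorithmic components. The first sketch below illustrates, under stated assumptions, how an elastic-ring SOM can serve as a heuristic route planner for a TSP-style drone tour; the waypoint coordinates, neuron count, and decay schedules are illustrative choices, not the authors' settings.

```python
# Minimal sketch of a Self-Organizing Map (SOM) heuristic for a TSP-style
# drone tour. Hyperparameters are illustrative assumptions.
import numpy as np

def som_tsp_route(cities, n_iter=5000, lr=0.8, seed=0):
    rng = np.random.default_rng(seed)
    n_neurons = 8 * len(cities)                    # ring larger than the city set
    neurons = rng.random((n_neurons, 2))           # random initial ring
    radius = n_neurons / 10.0                      # initial neighborhood radius
    for _ in range(n_iter):
        city = cities[rng.integers(len(cities))]   # sample one waypoint
        winner = np.argmin(np.linalg.norm(neurons - city, axis=1))
        d = np.abs(np.arange(n_neurons) - winner)  # circular distance on the ring
        d = np.minimum(d, n_neurons - d)
        g = np.exp(-(d ** 2) / (2 * radius ** 2))  # Gaussian neighborhood
        neurons += lr * g[:, None] * (city - neurons)
        radius *= 0.9997                           # decay neighborhood radius
        lr *= 0.99997                              # decay learning rate
        if radius < 1:
            break
    # Order cities by the ring index of their nearest neuron -> Hamiltonian cycle.
    order = np.argsort([np.argmin(np.linalg.norm(neurons - c, axis=1)) for c in cities])
    return order

waypoints = np.random.default_rng(1).random((20, 2))   # 20 waypoints on a unit map
print(som_tsp_route(waypoints))
```

The second sketch illustrates the soft-label exchange behind a KD-based FL round: each client uploads temperature-softened class probabilities computed on a shared dataset, and the server averages them, so the per-round payload scales with the shared-set size and class count rather than with the number of model parameters (as in FedAvg). The shared-set size, class count, and temperature are placeholder assumptions, and random logits stand in for locally trained client models.

```python
# Minimal sketch of soft-label aggregation in KD-based FL; values are placeholders.
import numpy as np

def softmax(z, T=3.0):
    z = z / T                                      # temperature-softened logits
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def client_soft_labels(logits_on_shared_set, T=3.0):
    # Each client runs its local model on the shared dataset and uploads
    # only these class-probability vectors (soft labels), not model weights.
    return softmax(logits_on_shared_set, T)

def server_aggregate(all_soft_labels):
    # The server averages soft labels; clients distill from this consensus
    # during their next local training round.
    return np.mean(all_soft_labels, axis=0)

# Example: 3 clients, a shared set of 1000 samples, 6 HAR activity classes.
rng = np.random.default_rng(0)
clients = [client_soft_labels(rng.normal(size=(1000, 6))) for _ in range(3)]
global_soft = server_aggregate(np.stack(clients))
print(global_soft.shape)   # (1000, 6): payload grows with the shared set,
                           # not with the number of model parameters.
```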