KNOWLEDGE DISTILLATION FOR WIRELESS EDGE LEARNING

Cited: 0
Authors
Mohamed, Ahmed P. [1 ]
Jameel, Abu Shafin Mohammad Mahdee [1 ]
El Gamal, Aly [1 ]
Affiliation
[1] Purdue Univ, Sch Elect & Comp Engn, W Lafayette, IN 47907 USA
Keywords
Frame Error Prediction; Knowledge Distillation; SMOTE; Edge Learning; Collaborative Spectrum Sharing;
DOI
10.1109/SSP49050.2021.9513752
CLC Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification Codes
0808 ; 0809 ;
Abstract
In this paper, we propose a framework for predicting frame errors in the collaborative, spectrally congested wireless environments of the DARPA Spectrum Collaboration Challenge (SC2), using a recently collected dataset. We employ distributed deep edge learning that is shared among edge nodes and a central cloud. On this close-to-practice dataset, we find that widely used federated learning approaches, especially privacy-preserving ones, perform worse than local training across a wide range of settings. We therefore apply the synthetic minority oversampling technique (SMOTE) to preserve privacy by avoiding the transfer of raw local data to the cloud, and use knowledge distillation to benefit from the cloud's high computing and storage capabilities. The proposed framework achieves better overall performance than both local and federated training approaches, while remaining robust against catastrophic failures as well as challenging channel conditions that result in high frame error rates.
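The record does not include the paper's implementation, so the following is only a minimal sketch of the SMOTE idea named in the abstract: rare minority-class samples (here, frame-error events) are oversampled by interpolating between each real minority sample and one of its k nearest minority-class neighbors, so that only synthetic records would need to leave an edge node. The function name, feature dimensionality, and parameter values below are illustrative assumptions, not details from the paper.

```python
# Hedged sketch of SMOTE for privacy-preserving data sharing: generate
# synthetic minority-class samples so raw frame-error records can stay
# on the edge node. All names and shapes here are hypothetical.
import numpy as np

def smote(minority, n_synthetic, k=5, rng=None):
    """Create n_synthetic samples by interpolating between each chosen
    minority sample and one of its k nearest minority-class neighbors."""
    rng = rng or np.random.default_rng()
    n = len(minority)
    # Pairwise Euclidean distances within the minority class only.
    d = np.linalg.norm(minority[:, None, :] - minority[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                      # exclude self-matches
    neighbors = np.argsort(d, axis=1)[:, :k]         # k nearest per sample
    base = rng.integers(0, n, size=n_synthetic)      # random anchor samples
    picked = neighbors[base, rng.integers(0, k, size=n_synthetic)]
    gap = rng.random((n_synthetic, 1))               # interpolation factors
    return minority[base] + gap * (minority[picked] - minority[base])

# Toy usage: oversample rare frame-error feature vectors (8 features)
# before sharing only the synthetic set with the cloud.
errors = np.random.default_rng(0).normal(size=(20, 8))
synthetic = smote(errors, n_synthetic=100)
```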
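Similarly, a hedged sketch of a standard (Hinton-style) knowledge distillation loss of the kind the abstract alludes to: a small edge student model is trained on a cloud teacher's temperature-softened predictions alongside the hard labels. The temperature, mixing weight, and two-class setup are assumptions for illustration; the paper's actual architectures and hyperparameters are not given in this record.

```python
# Hedged sketch of a knowledge distillation loss: an edge student learns
# from a cloud teacher's soft predictions plus the hard labels.
# Temperature and alpha are illustrative, not values from the paper.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    # Soft targets: teacher probabilities smoothed by the temperature.
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    # T^2 rescales gradients so the soft term stays comparable to hard CE.
    soft_loss = F.kl_div(log_student, soft_targets,
                         reduction="batchmean") * temperature ** 2
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * hard_loss + (1.0 - alpha) * soft_loss

# Toy usage: binary frame-error prediction (error / no error), batch of 16.
student = torch.randn(16, 2, requires_grad=True)
teacher = torch.randn(16, 2)
labels = torch.randint(0, 2, (16,))
loss = distillation_loss(student, teacher, labels)
loss.backward()
```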
Pages: 600-604
Page count: 5