KNOWLEDGE DISTILLATION FOR WIRELESS EDGE LEARNING

Times Cited: 0
Authors
Mohamed, Ahmed P. [1 ]
Jameel, Abu Shafin Mohammad Mahdee [1 ]
El Gamal, Aly [1 ]
Affiliations
[1] Purdue Univ, Sch Elect & Comp Engn, W Lafayette, IN 47907 USA
Keywords
Frame Error Prediction; Knowledge Distillation; SMOTE; Edge Learning; Collaborative Spectrum Sharing;
DOI
10.1109/SSP49050.2021.9513752
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
In this paper, we propose a framework for predicting frame errors in the collaborative, spectrally congested wireless environments of the DARPA Spectrum Collaboration Challenge (SC2), using a recently collected dataset. We employ distributed deep edge learning shared between edge nodes and a central cloud. Using this close-to-practice dataset, we find that widely used federated learning approaches, especially privacy-preserving ones, perform worse than local training over a wide range of settings. We therefore use the synthetic minority oversampling technique (SMOTE) to preserve privacy by avoiding the transfer of local data to the cloud, and employ knowledge distillation to benefit from the cloud's high computing and storage capabilities. The proposed framework achieves better overall performance than both local and federated training, while remaining robust against catastrophic failures as well as challenging channel conditions that result in high frame error rates.
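The abstract describes two components that can be illustrated concretely: SMOTE-based oversampling of the rare frame-error class on the edge node (so raw frames never leave the device) and knowledge distillation from a larger cloud-trained teacher into a small edge student. The Python sketch below is illustrative only; the synthetic dataset, network sizes, temperature, and loss weighting are assumptions made for the example and are not taken from the paper.

    import numpy as np
    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    from imblearn.over_sampling import SMOTE

    # Hypothetical local data: one feature vector per transmitted frame,
    # label 1 = frame error (rare minority class), 0 = successful frame.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 16)).astype(np.float32)
    y = (rng.random(2000) < 0.05).astype(np.int64)

    # SMOTE balances the classes with synthetic minority samples, computed
    # locally so that raw frame data never has to be uploaded to the cloud.
    X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)
    X_bal = torch.tensor(X_bal, dtype=torch.float32)
    y_bal = torch.tensor(y_bal, dtype=torch.long)

    def make_mlp(width):
        return nn.Sequential(nn.Linear(16, width), nn.ReLU(), nn.Linear(width, 2))

    teacher = make_mlp(128)  # stand-in for a larger, cloud-trained model
    student = make_mlp(16)   # small model that runs on the edge node

    # Distill the (assumed pre-trained) teacher into the student:
    # loss = alpha * hard-label cross-entropy + (1 - alpha) * T^2 * KL(soft targets).
    T, alpha = 4.0, 0.5
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)
    for _ in range(50):
        opt.zero_grad()
        with torch.no_grad():
            soft_targets = F.softmax(teacher(X_bal) / T, dim=1)
        logits = student(X_bal)
        hard = F.cross_entropy(logits, y_bal)
        soft = F.kl_div(F.log_softmax(logits / T, dim=1), soft_targets,
                        reduction="batchmean") * (T * T)
        loss = alpha * hard + (1 - alpha) * soft
        loss.backward()
        opt.step()

The temperature T softens the teacher's class probabilities so the student can learn from the relative likelihood it assigns to frame success versus error, and the T^2 factor is the usual scaling that keeps the soft-target gradient comparable in magnitude to the hard-label term; both values here are illustrative defaults, not settings reported in the paper.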
Pages: 600 - 604
Number of pages: 5
Related Papers
50 records in total
  • [21] Skill enhancement learning with knowledge distillation
    Liu, Naijun
    Sun, Fuchun
    Fang, Bin
    Liu, Huaping
    Science China (Information Sciences), 2024, 67 (08) : 206 - 220
  • [22] Neuron Manifold Distillation for Edge Deep Learning
    Tao, Zeyi
    Xia, Qi
    Li, Qun
    2021 IEEE/ACM 29TH INTERNATIONAL SYMPOSIUM ON QUALITY OF SERVICE (IWQOS), 2021,
  • [23] BookKD: A novel knowledge distillation for reducing distillation costs by decoupling knowledge generation and learning
    Zhu, Songling
    Shang, Ronghua
    Tang, Ke
    Xu, Songhua
    Li, Yangyang
    KNOWLEDGE-BASED SYSTEMS, 2023, 279
  • [24] Continual Learning Based on Knowledge Distillation and Representation Learning
    Chen, Xiu-Yan
    Liu, Jian-Wei
    Li, Wen-Tao
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2022, PT IV, 2022, 13532 : 27 - 38
  • [25] MECKD: Deep Learning-Based Fall Detection in Multilayer Mobile Edge Computing With Knowledge Distillation
    Mao, Wei-Lung
    Wang, Chun-Chi
    Chou, Po-Heng
    Liu, Kai-Chun
    Tsao, Yu
    IEEE Sensors Journal, 2024, 24 (24) : 42195 - 42209
  • [26] Knowledge Distillation and Training Balance for Heterogeneous Decentralized Multi-Modal Learning Over Wireless Networks
    Yin, Benshun
    Chen, Zhiyong
    Tao, Meixia
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (10) : 9629 - 9644
  • [27] A tucker decomposition based knowledge distillation for intelligent edge applications
    Dai, Cheng
    Liu, Xingang
    Li, Zhuolin
    Chen, Mu-Yen
    APPLIED SOFT COMPUTING, 2021, 101
  • [28] Generative Adversarial Super-Resolution at the edge with knowledge distillation
    Angarano, Simone
    Salvetti, Francesco
    Martini, Mauro
    Chiaberge, Marcello
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 123
  • [29] Personalized Decentralized Federated Learning with Knowledge Distillation
    Jeong, Eunjeong
    Kountouris, Marios
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 1982 - 1987
  • [30] Heterogeneous Knowledge Distillation Using Conceptual Learning
    Yu, Yerin
    Kim, Namgyu
    IEEE ACCESS, 2024, 12 : 52803 - 52814