KNOWLEDGE DISTILLATION FOR WIRELESS EDGE LEARNING

Cited by: 0
Authors
Mohamed, Ahmed P. [1 ]
Fameel, Abu Shafin Mohammad Mandee [1 ]
El Gamal, Aly [1 ]
Affiliations
[1] Purdue Univ, Sch Elect & Comp Engn, W Lafayette, IN 47907 USA
Keywords
Frame Error Prediction; Knowledge Distillation; SMOTE; Edge Learning; Collaborative Spectrum Sharing;
DOI
10.1109/SSP49050.2021.9513752
Chinese Library Classification (CLC)
TM [Electrical technology]; TN [Electronics and communication technology];
Discipline classification code
0808; 0809
Abstract
In this paper, we propose a framework for predicting frame errors in the collaborative, spectrally congested wireless environments of the DARPA Spectrum Collaboration Challenge (SC2), using a recently collected dataset. We employ distributed deep edge learning that is shared among edge nodes and a central cloud. Using this close-to-practice dataset, we find that widely used federated learning approaches, especially those that are privacy preserving, perform worse than local training across a wide range of settings. We therefore use the synthetic minority oversampling technique (SMOTE) to preserve privacy by avoiding the transfer of raw local data to the cloud, and apply knowledge distillation to benefit from the cloud's greater computing and storage capabilities. The proposed framework achieves better overall performance than both local and federated training approaches, while remaining robust against catastrophic failures as well as challenging channel conditions that produce high frame error rates.
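The abstract combines two building blocks: sharing SMOTE-generated synthetic samples with the cloud instead of raw local frame records, and distilling a larger cloud-trained teacher model into a small edge student. The sketch below is a minimal, hedged illustration of those two ideas only; the toy data, model sizes, temperature, and mixing weight are hypothetical placeholders and not taken from the paper.

```python
# Illustrative sketch (not the authors' code): SMOTE-based synthetic data sharing
# followed by teacher-student knowledge distillation at the edge.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from imblearn.over_sampling import SMOTE

# --- Edge side: oversample the rare "frame error" class ------------------------
rng = np.random.default_rng(0)
X_local = rng.normal(size=(1000, 16)).astype(np.float32)  # toy link-quality features
y_local = (rng.random(1000) < 0.05).astype(int)           # ~5% frame errors (imbalanced)
X_res, y_res = SMOTE(random_state=0).fit_resample(X_local, y_local)
# In a privacy-preserving setup, only the synthetic portion of (X_res, y_res)
# would be uploaded to the cloud; the raw local frames never leave the edge node.

# --- Cloud side: a larger teacher trained on pooled synthetic data -------------
# (teacher training loop omitted; the teacher is used only for its soft labels)
teacher = nn.Sequential(nn.Linear(16, 128), nn.ReLU(), nn.Linear(128, 2))

# --- Edge side: distill the teacher into a small student -----------------------
student = nn.Sequential(nn.Linear(16, 16), nn.ReLU(), nn.Linear(16, 2))
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
T, alpha = 4.0, 0.5                                       # temperature / KD mixing weight
x = torch.from_numpy(X_local)
y = torch.from_numpy(y_local).long()
for _ in range(5):                                        # a few illustrative epochs
    with torch.no_grad():
        soft_targets = F.softmax(teacher(x) / T, dim=1)   # teacher's softened predictions
    logits = student(x)
    kd_loss = F.kl_div(F.log_softmax(logits / T, dim=1), soft_targets,
                       reduction="batchmean") * T * T
    ce_loss = F.cross_entropy(logits, y)                  # hard-label loss on local data
    loss = alpha * kd_loss + (1 - alpha) * ce_loss
    opt.zero_grad()
    loss.backward()
    opt.step()
```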
Pages: 600 - 604
Page count: 5