KNOWLEDGE DISTILLATION FOR WIRELESS EDGE LEARNING

Cited by: 0
Authors
Mohamed, Ahmed P. [1 ]
Fameel, Abu Shafin Mohammad Mandee [1 ]
El Gamal, Aly [1 ]
Affiliations
[1] Purdue Univ, Sch Elect & Comp Engn, W Lafayette, IN 47907 USA
Keywords
Frame Error Prediction; Knowledge Distillation; SMOTE; Edge Learning; Collaborative Spectrum Sharing;
DOI
10.1109/SSP49050.2021.9513752
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Subject Classification Codes
0808; 0809;
Abstract
In this paper, we propose a framework for predicting frame errors in the collaborative, spectrally congested wireless environments of the DARPA Spectrum Collaboration Challenge (SC2), using a recently collected dataset. We employ distributed deep edge learning in which training is shared between edge nodes and a central cloud. On this close-to-practice dataset, we find that widely used federated learning approaches, especially privacy-preserving ones, perform worse than local training across a wide range of settings. We therefore apply the synthetic minority oversampling technique (SMOTE) to preserve privacy by avoiding the transfer of local data to the cloud, and use knowledge distillation to benefit from the cloud's high computing and storage capabilities. The proposed framework achieves better overall performance than both local and federated training, while remaining robust against catastrophic failures and the challenging channel conditions that result in high frame error rates.
Pages: 600-604
Page count: 5
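
The abstract names two ingredients: SMOTE-based oversampling of the minority (frame-error) class so that only synthetic rather than raw local data would need to leave an edge node, and knowledge distillation from a large cloud-side teacher into a small edge student. The following Python sketch illustrates both in a generic form; it is not the authors' pipeline, and the feature dimension, model sizes, temperature, and synthetic data are placeholder assumptions.

# Minimal sketch (assumed setup, not the paper's implementation): SMOTE
# oversampling of the rare frame-error class, followed by one knowledge-
# distillation training step for a small edge-side student model.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from imblearn.over_sampling import SMOTE

# Synthetic stand-in for per-frame link features; ~5% of frames are errors.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 16)).astype(np.float32)   # 16 placeholder link-quality features
y = (rng.random(2000) < 0.05).astype(np.int64)       # 1 = frame error (minority class)

# SMOTE synthesizes additional minority-class samples locally, so raw
# measurements never have to be shared with the cloud.
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)

# Tiny teacher (cloud) and student (edge) classifiers; sizes are illustrative.
teacher = nn.Sequential(nn.Linear(16, 128), nn.ReLU(), nn.Linear(128, 2))
student = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.5):
    # Soft-target KL term (scaled by T^2) blended with ordinary cross-entropy.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard

# One illustrative training step for the edge student.
xb = torch.from_numpy(np.asarray(X_bal, dtype=np.float32))
yb = torch.from_numpy(np.asarray(y_bal, dtype=np.int64))
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
with torch.no_grad():
    t_logits = teacher(xb)          # teacher assumed already trained in the cloud
loss = distillation_loss(student(xb), t_logits, yb)
opt.zero_grad()
loss.backward()
opt.step()
print(f"distillation loss: {loss.item():.4f}")

The T^2 factor on the soft-target term is the standard scaling that keeps its gradient magnitude comparable to the hard-label cross-entropy as the temperature grows; the blend weight alpha is a tunable assumption here.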