Communication Traffic Prediction with Continual Knowledge Distillation

Cited by: 1

Authors
Li, Hang [1 ]
Wang, Ju [1 ]
Hu, Chengming [1 ]
Chen, Xi [1 ]
Liu, Xue [1 ]
Jang, Seowoo [2 ]
Dudek, Gregory [1 ]
Institutions
[1] Samsung Electronics, Mississauga, ON, Canada
[2] Samsung Electronics, Suwon, South Korea
Keywords
traffic prediction; knowledge distillation
DOI
10.1109/ICC45855.2022.9838521
Chinese Library Classification
TN [Electronic Technology; Communication Technology]
Discipline Classification Code
0809
Abstract
Accurate traffic volume estimation and prediction are essential for advanced communication network functions, such as automated operations and predictive resource allocation. Although machine learning (ML)-based approaches have achieved great success toward this goal, existing approaches suffer from two drawbacks that limit their real-world application. First, ML-based prediction models developed in the past may be obsolete now, since real-world communication traffic patterns and volumes keep changing, leading to prediction errors. Second, most Base Stations (BSs) can only store a small amount of data due to limited storage capacity and high storage costs, which prevents training an accurate prediction model. In this paper, we propose a novel framework that adapts the prediction model to constantly changing traffic using only a small amount of current traffic data. Specifically, the framework first extracts as much knowledge as possible from historical traffic data using a proposed two-branch neural network design, which includes a prediction module and a reconstruction module. Then, the framework transfers knowledge from the old (past) prediction model to the new (current) model via a proposed continual knowledge distillation technique. Evaluations on a real-world dataset show that the proposed framework reduces the Mean Absolute Error (MAE) of traffic prediction by up to 9.62% compared to state-of-the-art prediction methods.
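The knowledge-transfer step described in the abstract combines a ground-truth loss with a distillation loss that pulls the new (student) model toward the old (teacher) model's outputs. The paper's exact losses and architecture are not reproduced here, so the following is a minimal, hypothetical sketch of the general knowledge-distillation idea for regression: a scalar linear predictor trained by gradient descent, with MSE for both loss terms. The names `distillation_loss`, `train_student`, the weight `alpha`, and the toy data are illustrative assumptions, not the authors' method.

```python
import numpy as np

def distillation_loss(student_pred, teacher_pred, target, alpha=0.5):
    """Weighted sum of the ground-truth (task) loss and the distillation
    loss that keeps the student close to the teacher's predictions."""
    task = np.mean((student_pred - target) ** 2)
    distill = np.mean((student_pred - teacher_pred) ** 2)
    return alpha * task + (1 - alpha) * distill

def train_student(x, y, teacher_w, alpha=0.5, lr=0.01, epochs=500):
    """Fit a scalar linear predictor y ~ w*x on the new traffic data (x, y),
    regularized toward the outputs of a frozen teacher with weight teacher_w."""
    w = 0.0
    for _ in range(epochs):
        pred = w * x
        teacher_pred = teacher_w * x
        # Analytic gradient of distillation_loss with respect to w.
        grad = (2 * alpha * np.mean((pred - y) * x)
                + 2 * (1 - alpha) * np.mean((pred - teacher_pred) * x))
        w -= lr * grad
    return w
```

With `alpha=1.0` the student fits only the new data; with smaller `alpha` its solution is interpolated toward the teacher, which is the mechanism that lets a model update retain past knowledge when only a few current samples are stored.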
Pages: 5481-5486
Page count: 6