Lightweight Neural Network With Knowledge Distillation for CSI Feedback

Times Cited: 3
Authors
Cui, Yiming [1 ]
Guo, Jiajia [1 ]
Cao, Zheng [1 ]
Tang, Huaze [1 ]
Wen, Chao-Kai [2 ]
Jin, Shi [1 ]
Wang, Xin [3 ]
Hou, Xiaolin [3 ]
Affiliations
[1] Southeast Univ, Natl Mobile Commun Res Lab, Nanjing 210096, Peoples R China
[2] Natl Sun Yat Sen Univ, Inst Commun Engn, Kaohsiung 80424, Taiwan
[3] DOCOMO Beijing Commun Labs Co Ltd, Beijing 100190, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Massive MIMO; CSI feedback; neural network lightweight; knowledge distillation; MIMO; COMPRESSION; WIRELESS;
DOI
10.1109/TCOMM.2024.3377724
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
Deep learning has shown promise in enhancing channel state information (CSI) feedback. However, many studies indicate that better feedback performance often comes with higher computational complexity. Pursuing a better performance-complexity tradeoff is crucial for practical deployment, especially on computation-limited devices, which may otherwise have to use a lightweight autoencoder with unfavorable performance. To this end, this paper introduces knowledge distillation (KD), in which knowledge from a complicated teacher autoencoder is transferred to a lightweight student autoencoder to improve its performance. Two implementation methods are proposed. First, an autoencoder KD-based method trains the student autoencoder to mimic the reconstructed CSI of a pretrained teacher autoencoder. Second, an encoder KD-based method reduces training overhead by performing KD only on the student encoder. Additionally, a variant of encoder KD is introduced to protect the intellectual property of user equipment and base station vendors. Numerical simulations demonstrate that the proposed methods significantly improve the student autoencoder's performance while reducing the number of floating point operations and the inference time to 3.05%-5.28% and 13.80%-14.76% of those of the teacher network, respectively. Furthermore, the variant encoder KD method effectively enhances the student autoencoder's generalization capability across different scenarios, environments, and bandwidths.
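The autoencoder KD idea described in the abstract, where the student is trained both to reconstruct the true CSI and to mimic the pretrained teacher's reconstruction, can be sketched as a weighted two-term loss. This is a minimal illustrative sketch, not the authors' actual training code: the weight `alpha`, the plain-Python `mse` helper, and the toy CSI vectors are assumptions introduced here for illustration.

```python
def mse(a, b):
    """Mean squared error between two equal-length CSI vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def kd_loss(student_out, teacher_out, true_csi, alpha=0.5):
    """Autoencoder-KD objective (sketch): the student is penalized both for
    deviating from the ground-truth CSI and for deviating from the teacher's
    reconstruction; alpha (assumed hyperparameter) balances the two terms."""
    return alpha * mse(student_out, true_csi) + (1 - alpha) * mse(student_out, teacher_out)

# Toy example with flattened CSI vectors (hypothetical values).
true_csi = [1.0, 0.0, -1.0, 0.5]
teacher_out = [0.9, 0.1, -0.9, 0.4]   # teacher reconstruction, close to the truth
student_out = [0.7, 0.2, -0.8, 0.3]   # student reconstruction, farther off

loss = kd_loss(student_out, teacher_out, true_csi)
```

In the encoder KD variant described above, the distillation target would instead be the teacher encoder's codeword rather than the reconstructed CSI, which avoids retraining the full student autoencoder.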
Pages: 4917-4929
Number of pages: 13
Related Papers
50 records
  • [41] CSI Feedback Prediction using UE Aware Sparse Neural Network Framework
    Singh, Sukhdeep
    Kumar, Swaraj
    Saha, Rahul Kumar
    Agarwal, Shreyanshu
    Kaur, Ashmeet
    2024 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS WORKSHOPS, ICC WORKSHOPS 2024, 2024, : 2011 - 2016
  • [42] A Lightweight Method for Graph Neural Networks Based on Knowledge Distillation and Graph Contrastive Learning
    Wang, Yong
    Yang, Shuqun
    APPLIED SCIENCES-BASEL, 2024, 14 (11):
  • [43] CSI Compression and Feedback for Network MIMO
    Kurras, Martin
    Jaeckel, Stephan
    Thiele, Lars
    Braun, Volker
    2015 IEEE 81ST VEHICULAR TECHNOLOGY CONFERENCE (VTC SPRING), 2015,
  • [44] A Lightweight SHM Framework Based on Adaptive Multisensor Fusion Network and Multigeneration Knowledge Distillation
    Li, Sijue
    Liu, Fengyi
    Peng, Gaoliang
    Cheng, Feng
    Zhao, Benqi
    Ji, Mengyu
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2022, 71
  • [45] Boosting Lightweight CNNs Through Network Pruning and Knowledge Distillation for SAR Target Recognition
    Wang, Zhen
    Du, Lan
    Li, Yi
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2021, 14 : 8386 - 8397
  • [46] Lightweight Dual Stream Network With Knowledge Distillation for RGB-D Scene Parsing
    Zhang, Yuming
    Zhou, Wujie
    Ran, Xiaoxiao
    Fang, Meixin
    IEEE SIGNAL PROCESSING LETTERS, 2024, 31 : 855 - 859
  • [47] KD-LightNet: A Lightweight Network Based on Knowledge Distillation for Industrial Defect Detection
    Liu, Jinhai
    Li, Hengguang
    Zuo, Fengyuan
    Zhao, Zhen
    Lu, Senxiang
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2023, 72
  • [48] Network lightweight method based on knowledge distillation is applied to RV reducer fault diagnosis
    He, Feifei
    Liu, Chang
    Wang, Mengdi
    Yang, Enshan
    Liu, Xiaoqin
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2023, 34 (09)
  • [49] Toward Efficient Image Denoising: A Lightweight Network with Retargeting Supervision Driven Knowledge Distillation
    Zou, Beiji
    Zhang, Yue
    Wang, Min
    Liu, Shu
    ADVANCES IN COMPUTER GRAPHICS, CGI 2022, 2022, 13443 : 15 - 27
  • [50] Model-Driven Lightweight Network for CSI Feedback in Time-Varying Massive MIMO Systems
    Zhang, Yangyang
    Zhang, Xichang
    Liu, Yi
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 4961 - 4966