Communication-Efficient Personalized Federated Edge Learning for Massive MIMO CSI Feedback

Cited by: 2
Authors
Cui, Yiming [1 ]
Guo, Jiajia [1 ]
Wen, Chao-Kai [2 ]
Jin, Shi [1 ]
Affiliations
[1] Southeast Univ, Natl Mobile Commun Res Lab, Nanjing 210096, Peoples R China
[2] Natl Sun Yat Sen Univ, Inst Commun Engn, Kaohsiung 80424, Taiwan
Funding
National Natural Science Foundation of China;
Keywords
Training; Correlation; Downlink; Servers; Data privacy; Antenna arrays; Uplink; Massive MIMO; CSI feedback; federated edge learning; neural network quantization; personalization; CHANNEL RECIPROCITY;
DOI
10.1109/TWC.2023.3339824
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic and Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
Deep learning (DL)-based channel state information (CSI) feedback has garnered significant research attention in recent years. However, previous research has overlooked the potential privacy disclosure problem caused by transmitting CSI datasets during the training process. In this study, we introduce a federated edge learning (FEEL)-based training framework for DL-based CSI feedback. This approach differs from the conventional centralized learning (CL)-based framework, where the CSI datasets are collected at the base station (BS) before training. Instead, each user equipment (UE) trains a local autoencoder network and exchanges model parameters with the BS. This approach provides better protection for data privacy compared to CL. To further reduce communication overhead in FEEL, we quantize the uplink and downlink model transmission into different bits based on their influence on feedback performance. Additionally, since the heterogeneity of CSI datasets among different UEs can degrade the performance of the FEEL-based framework, we introduce a personalization strategy to enhance feedback performance. This strategy allows for local fine-tuning to adapt the global model to the channel characteristics of each UE. Simulation results indicate that the proposed personalized FEEL-based training framework can significantly improve the performance of DL-based CSI feedback while reducing communication overhead.
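The training protocol outlined in the abstract (local autoencoder training at each UE, quantized model exchange with the BS, aggregation of the uploaded models, and final per-UE fine-tuning for personalization) can be illustrated with the minimal Python/PyTorch sketch below. It assumes a toy fully connected autoencoder, FedAvg-style averaging, uniform per-tensor weight quantization, and arbitrary downlink/uplink bit widths; every name, dimension, and hyperparameter is an illustrative placeholder, not the authors' implementation.

    # Hypothetical sketch of personalized FEEL for CSI feedback (all names illustrative).
    import torch
    import torch.nn as nn

    CSI_DIM, CODE_DIM = 256, 32          # flattened CSI size and feedback codeword size (assumed)
    NUM_UES, ROUNDS, LOCAL_EPOCHS = 4, 5, 2

    def make_autoencoder():
        # Simple fully connected encoder/decoder standing in for a CSI feedback network.
        return nn.Sequential(
            nn.Linear(CSI_DIM, CODE_DIM), nn.ReLU(),   # encoder (runs at the UE)
            nn.Linear(CODE_DIM, CSI_DIM),              # decoder (runs at the BS)
        )

    def quantize(state, bits):
        # Uniform per-tensor quantization of model weights before transmission.
        q = {}
        for k, w in state.items():
            lo, hi = w.min(), w.max()
            step = (hi - lo) / (2 ** bits - 1) + 1e-12
            q[k] = (torch.round((w - lo) / step) * step + lo).detach()
        return q

    def local_train(model, data, epochs):
        opt = torch.optim.Adam(model.parameters(), lr=1e-3)
        loss_fn = nn.MSELoss()
        for _ in range(epochs):
            opt.zero_grad()
            loss = loss_fn(model(data), data)          # autoencoder reconstructs its own input
            loss.backward()
            opt.step()
        return model.state_dict()

    # Heterogeneous local CSI datasets (random placeholders for real channel samples).
    datasets = [torch.randn(128, CSI_DIM) * (0.5 + i) for i in range(NUM_UES)]
    global_model = make_autoencoder()

    for rnd in range(ROUNDS):
        # Downlink: BS broadcasts a coarsely quantized global model to every UE.
        broadcast = quantize(global_model.state_dict(), bits=4)
        uploads = []
        for data in datasets:
            local = make_autoencoder()
            local.load_state_dict(broadcast)
            # Uplink: each UE trains locally and returns a more finely quantized update.
            uploads.append(quantize(local_train(local, data, LOCAL_EPOCHS), bits=8))
        # FedAvg-style aggregation of the quantized local models at the BS.
        avg = {k: torch.stack([u[k] for u in uploads]).mean(dim=0) for k in uploads[0]}
        global_model.load_state_dict(avg)

    # Personalization: each UE fine-tunes the converged global model on its own CSI.
    personalized = []
    for data in datasets:
        m = make_autoencoder()
        m.load_state_dict(global_model.state_dict())
        local_train(m, data, epochs=1)
        personalized.append(m)

In the paper's setting, the random tensors would be replaced by each UE's locally collected CSI samples, and the downlink/uplink bit widths would be chosen according to their measured influence on feedback performance rather than the arbitrary 4/8 bits used here.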
Pages: 7362 - 7375
Page count: 14