Base Station Dataset-Assisted Broadband Over-the-Air Aggregation for Communication-Efficient Federated Learning

Cited by: 3
Authors
Hong, Jun-Pyo [1 ]
Park, Sangjun [2 ,3 ]
Choi, Wan [4 ]
Institutions
[1] Pukyong Natl Univ, Dept Informat & Commun Engn, Busan 48513, South Korea
[2] Korea Adv Inst Sci & Technol, Sch Elect Engn, Daejeon 34141, South Korea
[3] Seoul Natl Univ, Inst New Media & Commun, Seoul 08826, South Korea
[4] Seoul Natl Univ, Inst New Media & Commun, Dept Elect & Comp Engn, Seoul 08826, South Korea
Funding
National Research Foundation of Singapore
关键词
Convergence; Power control; Training; Distortion; Data models; Computational modeling; Broadband communication; Federated learning; over-the-air aggregation; dataset of base station; optimized power control; compressed update report; MULTIPLE-ACCESS; POWER-CONTROL; CONVERGENCE; ALLOCATION;
DOI
10.1109/TWC.2023.3249252
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Codes
0808; 0809
Abstract
This paper proposes an over-the-air aggregation framework for federated learning (FL) in broadband wireless networks where not only the edge devices but also the base station (BS) has its own local dataset. The proposed framework leverages the BS dataset to improve the communication efficiency of FL, both by reducing the number of channel uses required for model convergence and by avoiding the signaling overhead incurred by power-scale coordination among edge devices. We analyze convergence to a stationary point without a convexity assumption on the objective function. The analysis reveals that utilizing the BS dataset improves the convergence rate and that the update distortion caused by the limited power budget is a crucial factor hindering model convergence. To facilitate convergence, we develop an optimized power control method by solving the distortion minimization problem without assuming power-scale coordination or global CSI at the BS. Our simulation results validate that the BS dataset helps reduce the number of channel uses needed for model convergence and that the developed power control method outperforms the conventional method in terms of both convergence rate and converged test accuracy. Furthermore, we identify scenarios in which compressing the local updates helps reduce the communication resources required for model training.
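To make the aggregation mechanics in the abstract concrete, the Python sketch below simulates one over-the-air aggregation round. It is an illustration only, not the authors' algorithm: the budget-capped channel-inversion power rule, the Rayleigh channel model, and the fixed BS mixing weight alpha are all assumptions made for this example, whereas the paper instead derives an optimized power control by minimizing the update distortion.

import numpy as np

rng = np.random.default_rng(0)
K, d = 8, 100                # edge devices, model dimension
P_max = 1.0                  # per-device transmit power budget (assumed)
noise_std = 0.05             # receiver noise level (assumed)
alpha = 0.2                  # BS-update mixing weight (illustrative, not optimized)

g = rng.normal(size=(K, d))            # local model updates at the devices
g_bs = rng.normal(size=d)              # update computed from the BS's own dataset
h = rng.rayleigh(scale=1.0, size=K)    # per-device channel magnitudes

# Per-device power control: invert the channel when the power budget allows,
# otherwise transmit at full power. Each device decides from its own CSI,
# so no power-scale coordination among devices is required.
signal_power = np.mean(g**2, axis=1)                     # average symbol power per device
b = np.minimum(1.0 / h, np.sqrt(P_max / signal_power))   # amplitude scaling factors

# Analog superposition: all devices transmit simultaneously over the same
# channel uses, so the receiver observes the (distorted) sum plus noise.
y = (h[:, None] * b[:, None] * g).sum(axis=0) + rng.normal(scale=noise_std, size=d)

# The BS mixes the noisy over-the-air sum with its own local update.
global_update = (1 - alpha) * y / K + alpha * g_bs

# Distortion relative to the ideal (noise-free, perfectly inverted) aggregate.
ideal = (1 - alpha) * g.mean(axis=0) + alpha * g_bs
print("aggregation distortion:", np.linalg.norm(global_update - ideal))

In this sketch, devices whose channels are too weak to fully invert within the power budget contribute attenuated updates; that attenuation plus the receiver noise is the update distortion the abstract identifies as the factor hindering convergence, and it is what the paper's optimized power control is designed to minimize.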
Pages: 7259-7272
Page count: 14