Adaptive Federated Learning With Gradient Compression in Uplink NOMA

Cited: 69
Authors
Sun, Haijian [1 ]
Ma, Xiang [2 ]
Hu, Rose Qingyang [2 ]
Affiliations
[1] Univ Wisconsin, Dept Comp Sci, Whitewater, WI 53190 USA
[2] Utah State Univ, Elect & Comp Engn Dept, Logan, UT 84322 USA
Funding
US National Science Foundation
Keywords
Federated learning; NOMA; adaptive wireless update; gradient compression;
DOI
10.1109/TVT.2020.3027306
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline classification codes
0808; 0809
Abstract
Federated learning (FL) is an emerging machine learning technique that aggregates model attributes from a large number of distributed devices. Compared with traditional centralized machine learning, FL uploads only model parameters rather than raw data during the learning process. Although distributed computing reduces the amount of information that needs to be uploaded, model updates in FL can still face a performance bottleneck, especially when training deep learning models in distributed networks. In this work, we investigate the performance of FL updates at mobile edge devices that are connected to the parameter server (PS) over wireless links. Considering the spectrum limitation on the wireless fading channels, we further exploit non-orthogonal multiple access (NOMA) together with adaptive gradient quantization and sparsification to facilitate efficient uplink FL updates. Simulation results show that the proposed scheme can significantly reduce FL aggregation latency while still achieving accuracy comparable to benchmark schemes.
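
A minimal sketch of the compression step described in the abstract, in Python/NumPy. The function names (compress_gradient, decompress_gradient), the fixed top-k sparsity ratio, and the uniform quantizer are illustrative assumptions; the paper's scheme adapts the quantization bit-width and sparsification level to the NOMA uplink conditions rather than fixing them.

    import numpy as np

    def compress_gradient(grad, sparsity=0.01, num_bits=4):
        # Illustrative sketch only: the paper adapts sparsity and bit-width
        # to channel conditions; here both are fixed parameters.
        flat = grad.ravel()
        k = max(1, int(sparsity * flat.size))
        # Top-k sparsification: keep only the k largest-magnitude entries.
        idx = np.argpartition(np.abs(flat), -k)[-k:]
        vals = flat[idx]
        # Uniform quantization of the surviving values to 2^num_bits levels.
        levels = 2 ** num_bits - 1
        v_min, v_max = vals.min(), vals.max()
        scale = (v_max - v_min) / levels if v_max > v_min else 1.0
        codes = np.round((vals - v_min) / scale).astype(np.uint8)
        # Only (idx, codes, v_min, scale) are sent over the uplink.
        return idx, codes, v_min, scale

    def decompress_gradient(idx, codes, v_min, scale, shape):
        # Parameter-server side: rebuild a dense gradient for aggregation.
        flat = np.zeros(int(np.prod(shape)))
        flat[idx] = v_min + codes * scale
        return flat.reshape(shape)

    # Example: a client compresses its local gradient, the PS reconstructs.
    g = np.random.randn(1000, 100).astype(np.float32)
    payload = compress_gradient(g)
    g_hat = decompress_gradient(*payload, shape=g.shape)

Under these assumed parameters (1% sparsity, 4-bit codes), a one-million-entry float32 gradient of about 4 MB shrinks to roughly 10^4 indices plus their 4-bit codes, on the order of tens of kilobytes, which is the kind of uplink saving the abstract attributes to the scheme.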
Pages: 16325-16329
Page count: 5
Related Papers
50 records in total
  • [1] ClusterGrad: Adaptive Gradient Compression by Clustering in Federated Learning
    Cui, Laizhong
    Su, Xiaoxin
    Zhou, Yipeng
    Zhang, Lei
    2020 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2020,
  • [2] Adaptive Batchsize Selection and Gradient Compression for Wireless Federated Learning
    Liu, Shengli
    Yu, Guanding
    Yin, Rui
    Yuan, Jiantao
    Qu, Fengzhong
    2020 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2020,
  • [3] Fast Uplink Grant for NOMA: A Federated Learning Based Approach
    Habachi, Oussama
    Adjif, Mohamed-Ali
    Cances, Jean-Pierre
    UBIQUITOUS NETWORKING, UNET 2019, 2020, 12293 : 96 - 109
  • [4] Research on Efficient Federated Learning Communication Mechanism Based on Adaptive Gradient Compression
    Tang, Lun
    Wang, Zhiping
    Pu, Hao
    Wu, Zhuang
    Chen, Qianbin
    JOURNAL OF ELECTRONICS & INFORMATION TECHNOLOGY, 2023, 45 (01) : 227 - 234
  • [5] Intrinsic Gradient Compression for Scalable and Efficient Federated Learning
    Melas-Kyriazi, Luke
    Wang, Franklyn
    PROCEEDINGS OF THE FIRST WORKSHOP ON FEDERATED LEARNING FOR NATURAL LANGUAGE PROCESSING (FL4NLP 2022), 2022, : 27 - 41
  • [6] Wyner-Ziv Gradient Compression for Federated Learning
    Liang, Kai
    Zhong, Huiru
    Chen, Haoning
    Wu, Youlong
    arXiv, 2021,
  • [7] Gradient Compression with a Variational Coding Scheme for Federated Learning
    Kathariya, Birendra
    Li, Zhu
    Chen, Jianle
    Van der Auwera, Geert
    2021 INTERNATIONAL CONFERENCE ON VISUAL COMMUNICATIONS AND IMAGE PROCESSING (VCIP), 2021,
  • [8] RFCSC: Communication efficient reinforcement federated learning with dynamic client selection and adaptive gradient compression
    Pan, Zhenhui
    Li, Yawen
    Guan, Zeli
    Liang, Meiyu
    Li, Ang
    Wang, Jia
    Kou, Feifei
    NEUROCOMPUTING, 2025, 612
  • [9] Decentralised federated learning with adaptive partial gradient aggregation
    Jiang, Jingyan
    Hu, Liang
    CAAI TRANSACTIONS ON INTELLIGENCE TECHNOLOGY, 2020, 5 (03) : 230 - 236
  • [10] Adaptive Compression in Federated Learning via Side Information
    Isik, Berivan
    Pase, Francesco
    Gunduz, Deniz
    Koyejo, Sanmi
    Weissman, Tsachy
    Zorzi, Michele
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024, 238