Communication-Efficient Federated Learning with Adaptive Quantization

Cited by: 25
Authors
Mao, Yuzhu [1,2]
Zhao, Zihao [1,2]
Yan, Guangfeng [3,4]
Liu, Yang [5]
Lan, Tian [6]
Song, Linqi [3,4]
Ding, Wenbo [1,2,7]
Affiliations
[1] Tsinghua Univ, Tsinghua Shenzhen Int Grad Sch, Lishui Rd, Shenzhen 518055, Guangdong, Peoples R China
[2] Tsinghua Univ, Tsinghua Berkeley Shenzhen Inst TBSI, Lishui Rd, Shenzhen 518055, Guangdong, Peoples R China
[3] City Univ Hong Kong, Kowloon, Tat Chee Ave, Hong Kong, Peoples R China
[4] City Univ Hong Kong, Shenzhen Res Inst, Yuexing First Rd, Shenzhen 518057, Guangdong, Peoples R China
[5] Tsinghua Univ, Inst AI Ind Res, Shuangqing Rd, Beijing 100190, Peoples R China
[6] George Washington Univ, Dept Elect & Comp Engn, 1918 F St NW, Washington, DC 20052 USA
[7] RISC V Int Open Source Lab, Lishui Rd, Shenzhen 518055, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Federated learning; information compression; communication efficiency
DOI
10.1145/3510587
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL) has attracted tremendous attention in recent years due to its privacy-preserving design and its potential in distributed yet privacy-sensitive applications such as finance and healthcare. However, the high communication overhead of transmitting high-dimensional model updates and additional security masks remains a bottleneck for FL. This article proposes a communication-efficient FL framework with an Adaptive Quantized Gradient (AQG), which adaptively adjusts the quantization level based on each local gradient update, exploiting the heterogeneity of local data distributions to reduce unnecessary transmissions. In addition, client dropout is taken into account, and an Augmented AQG is developed that bounds the dropout noise through an appropriate amplification mechanism for the transmitted gradients. Theoretical analysis and experimental results show that the proposed AQG achieves an additional 18% to 50% transmission reduction compared with existing popular methods, including Quantized Gradient Descent (QGD) and Lazily Aggregated Quantized (LAQ) gradient-based methods, without deteriorating convergence. Experiments with heterogeneous data distributions show an even larger transmission reduction than with independent and identically distributed data. Empirically, the proposed AQG is robust to client dropout rates of up to 90%, and the Augmented AQG further improves the FL system's communication efficiency under the moderate-scale client dropouts commonly seen in practical FL scenarios.
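The abstract describes the AQG mechanism only at a high level. The Python sketch below is a minimal illustration of the two ideas it names: a stochastic gradient quantizer whose resolution adapts to the size of each client's update, and an amplification step that compensates for dropped clients at aggregation. All function names, the specific bit-selection rule, and the inverse-participation weighting are assumptions chosen for illustration; they are not taken from the paper.

```python
import numpy as np

def quantize(grad, num_levels, rng):
    # Uniform stochastic quantizer onto `num_levels` levels spanning
    # [-max|g|, +max|g|]; stochastic rounding keeps it unbiased.
    scale = np.max(np.abs(grad))
    if scale == 0.0:
        return grad
    normalized = (grad + scale) / (2.0 * scale) * (num_levels - 1)
    lower = np.floor(normalized)
    # Round up with probability equal to the fractional part.
    levels = lower + (rng.random(grad.shape) < (normalized - lower))
    return levels / (num_levels - 1) * 2.0 * scale - scale

def adaptive_num_levels(grad, prev_grad, min_bits=2, max_bits=8):
    # Placeholder selection rule (not the paper's criterion): clients
    # whose gradient changed little since the last round quantize
    # coarsely (few bits), clients with large changes quantize finely.
    change = np.linalg.norm(grad - prev_grad)
    ref = np.linalg.norm(grad) + 1e-12
    bits = min_bits + round(min(change / ref, 1.0) * (max_bits - min_bits))
    return 2 ** bits

def aggregate_with_amplification(received, num_clients):
    # Inverse-probability amplification: scale each surviving update by
    # the inverse participation rate so the aggregate stays unbiased for
    # the full-participation average when clients drop out. This captures
    # the intuition behind the Augmented AQG's amplification mechanism.
    participation = len(received) / num_clients
    return sum(u / participation for u in received) / num_clients

# Toy round: two of three clients survive and send quantized gradients.
rng = np.random.default_rng(0)
grads = [rng.standard_normal(10) for _ in range(3)]
prevs = [rng.standard_normal(10) for _ in range(3)]
sent = [quantize(g, adaptive_num_levels(g, p), rng)
        for g, p in zip(grads[:2], prevs[:2])]
step = aggregate_with_amplification(sent, num_clients=3)
```

The stochastic rounding in `quantize` makes the quantizer unbiased in expectation, the standard property that convergence analyses of quantized gradient methods typically rely on.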
Pages: 26
Related Papers
50 records in total
  • [1] ADAPTIVE QUANTIZATION OF MODEL UPDATES FOR COMMUNICATION-EFFICIENT FEDERATED LEARNING
    Jhunjhunwala, Divyansh
    Gadhikar, Advait
    Joshi, Gauri
    Eldar, Yonina C.
    [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 3110 - 3114
  • [2] DAdaQuant: Doubly-adaptive quantization for communication-efficient Federated Learning
    Hoenig, Robert
    Zhao, Yiren
    Mullins, Robert
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [3] Communication-Efficient Adaptive Federated Learning
    Wang, Yujia
    Lin, Lu
    Chen, Jinghui
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [4] FedDQ: Communication-Efficient Federated Learning with Descending Quantization
    Qu, Linping
    Song, Shenghui
    Tsui, Chi-Ying
    [J]. 2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 281 - 286
  • [5] Communication-Efficient Federated Learning with Adaptive Consensus ADMM
    He, Siyi
    Zheng, Jiali
    Feng, Minyu
    Chen, Yixin
    [J]. APPLIED SCIENCES-BASEL, 2023, 13 (09):
  • [6] Communication-Efficient Federated Learning with Adaptive Parameter Freezing
    Chen, Chen
    Xu, Hong
    Wang, Wei
    Li, Baochun
    Li, Bo
    Chen, Li
    Zhang, Gong
    [J]. 2021 IEEE 41ST INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS (ICDCS 2021), 2021, : 1 - 11
  • [7] Communication-efficient federated learning
    Chen, Mingzhe
    Shlezinger, Nir
    Poor, H. Vincent
    Eldar, Yonina C.
    Cui, Shuguang
    [J]. PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2021, 118 (17)
  • [8] FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization
    Reisizadeh, Amirhossein
    Mokhtari, Aryan
    Hassani, Hamed
    Jadbabaie, Ali
    Pedarsani, Ramtin
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 2021 - 2030
  • [9] Adaptive Differential Filters for Fast and Communication-Efficient Federated Learning
    Becking, Daniel
    Kirchhoffer, Heiner
    Tech, Gerhard
    Haase, Paul
    Mueller, Karsten
    Schwarz, Heiko
    Samek, Wojciech
    [J]. 2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, CVPRW 2022, 2022, : 3366 - 3375
  • [10] Selective Updates and Adaptive Masking for Communication-Efficient Federated Learning
    Herzog, Alexander
    Southam, Robbie
    Belarbi, Othmane
    Anwar, Saif
    Bullo, Marcello
    Carnelli, Pietro
    Khan, Aftab
    [J]. IEEE TRANSACTIONS ON GREEN COMMUNICATIONS AND NETWORKING, 2024, 8 (02): 852 - 864