Communication-Efficient Agnostic Federated Averaging

Cited by: 0
Authors
Ro, Jae [1 ]
Chen, Mingqing [1 ]
Mathews, Rajiv [1 ]
Mohri, Mehryar [1 ,2 ]
Suresh, Ananda Theertha [1 ]
Affiliations
[1] Google Inc, Mountain View, CA 94043 USA
[2] Courant Inst Math Sci, New York, NY USA
Source
Interspeech 2021
Keywords
DOI
10.21437/Interspeech.2021-153
CLC Number
R36 [Pathology]; R76 [Otorhinolaryngology];
Subject Classification Code
100104; 100213;
Abstract
In distributed learning settings such as federated learning, the training algorithm can potentially be biased towards different clients. To overcome this bias, [1] proposed a domain-agnostic learning algorithm in which the model is optimized for any target distribution formed by a mixture of the client distributions. They further proposed an algorithm for the cross-silo federated learning setting, where the number of clients is small. We consider this problem in the cross-device setting, where the number of clients is much larger. We propose a communication-efficient distributed algorithm called Agnostic Federated Averaging (or AgnosticFedAvg) to minimize the domain-agnostic objective proposed in [1], which is also compatible with privacy-preserving mechanisms such as secure aggregation. We highlight two types of naturally occurring domains in federated learning and argue that AgnosticFedAvg performs well on both. To demonstrate its practical effectiveness, we report positive results for large-scale language modeling tasks in both simulation and live experiments, where the latter involves training language models for a Spanish virtual keyboard across millions of user devices.
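For readers unfamiliar with the domain-agnostic objective of [1], it takes the minimax form min_w max_{lambda in simplex} sum_k lambda_k L_k(w), where L_k is the expected loss on domain (client) k and lambda ranges over mixture weights of the client distributions. The Python sketch below is only a toy illustration of that objective, not the authors' AgnosticFedAvg protocol (which additionally handles cross-device sampling, communication efficiency, and secure aggregation); the quadratic per-domain losses, step sizes, and variable names are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

num_domains, dim = 3, 5
centers = rng.normal(size=(num_domains, dim))   # hypothetical per-domain optima (assumption)

def domain_loss(w, k):
    # Quadratic stand-in for domain k's expected loss L_k(w) (assumption).
    return 0.5 * float(np.sum((w - centers[k]) ** 2))

def domain_grad(w, k):
    return w - centers[k]

w = np.zeros(dim)                                # global model
lam = np.ones(num_domains) / num_domains         # mixture weights on the simplex

eta_w, eta_lam = 0.1, 0.5                        # assumed step sizes
for _ in range(200):
    losses = np.array([domain_loss(w, k) for k in range(num_domains)])
    # Model step: gradient descent on the lambda-weighted objective sum_k lam_k * L_k(w).
    w = w - eta_w * sum(lam[k] * domain_grad(w, k) for k in range(num_domains))
    # Mixture step: exponentiated-gradient ascent shifts weight toward the
    # worst-performing domains, then renormalizes back onto the simplex.
    lam = lam * np.exp(eta_lam * losses)
    lam = lam / lam.sum()

print("mixture weights:", np.round(lam, 3))
print("worst-domain loss:", max(domain_loss(w, k) for k in range(num_domains)))

In the federated setting described in the abstract, the per-domain losses and gradients would instead be estimated from sampled client updates and aggregated in a communication-efficient, secure-aggregation-compatible way; the sketch above only shows the objective being minimized.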
Pages: 871 - 875
Page count: 5
Related Papers
50 records in total (10 shown)
  • [1] Demystifying Model Averaging for Communication-Efficient Federated Matrix Factorization
    Wang, Shuai
    Suwandi, Richard Cornelius
    Chang, Tsung-Hui
    [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 3680 - 3684
  • [2] FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization
    Reisizadeh, Amirhossein
    Mokhtari, Aryan
    Hassani, Hamed
    Jadbabaie, Ali
    Pedarsani, Ramtin
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 2021 - 2030
  • [3] Communication-efficient federated learning
    Chen, Mingzhe
    Shlezinger, Nir
    Poor, H. Vincent
    Eldar, Yonina C.
    Cui, Shuguang
    [J]. PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2021, 118 (17)
  • [4] Communication-Efficient Vertical Federated Learning
    Khan, Afsana
    ten Thij, Marijn
    Wilbik, Anna
    [J]. ALGORITHMS, 2022, 15 (08)
  • [5] Communication-Efficient Adaptive Federated Learning
    Wang, Yujia
    Lin, Lu
    Chen, Jinghui
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [6] Communication-Efficient Federated Learning with Heterogeneous Devices
    Chen, Zhixiong
    Yi, Wenqiang
    Liu, Yuanwei
    Nallanathan, Arumugam
    [J]. ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 3602 - 3607
  • [7] Communication-Efficient Federated Double Distillation in IoV
    Yang, Peng
    Yan, Mengjiao
    Cui, Yaping
    He, Peng
    Wu, Dapeng
    Wang, Ruyan
    Chen, Luo
    [J]. IEEE TRANSACTIONS ON COGNITIVE COMMUNICATIONS AND NETWORKING, 2023, 9 (05) : 1340 - 1352
  • [8] Communication-Efficient Federated Learning with Adaptive Quantization
    Mao, Yuzhu
    Zhao, Zihao
    Yan, Guangfeng
    Liu, Yang
    Lan, Tian
    Song, Linqi
    Ding, Wenbo
    [J]. ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2022, 13 (04)
  • [9] FedAGL: A Communication-Efficient Federated Vehicular Network
    Liu, Su
    Li, Yushuai
    Guan, Peiyuan
    Li, Tianyi
    Yu, Jiong
    Taherkordi, Amir
    Jensen, Christian S.
    [J]. IEEE TRANSACTIONS ON INTELLIGENT VEHICLES, 2024, 9 (02): : 3704 - 3720
  • [10] FedBoost: Communication-Efficient Algorithms for Federated Learning
    Hamer, Jenny
    Mohri, Mehryar
    Suresh, Ananda Theertha
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119