Adaptive Federated Dropout: Improving Communication Efficiency and Generalization for Federated Learning

Cited by: 16
Authors
Bouacida, Nader [1 ]
Hou, Jiahui [1 ]
Zang, Hui [2 ]
Liu, Xin [1 ]
Affiliations
[1] Univ Calif Davis, Davis, CA 95616 USA
[2] Google, Mountain View, CA 94043 USA
Keywords
federated learning; compression; communication efficiency; generalization; convergence time;
DOI
10.1109/INFOCOMWKSHPS51825.2021.9484526
CLC Number
TP301 [Theory, Methods];
Subject Classification Code
081202
Abstract
To exploit the wealth of data generated and located at distributed entities such as mobile phones, a revolutionary decentralized machine learning setting, known as federated learning, enables multiple clients to collaboratively learn a machine learning model while keeping all their data on-device. However, the scale and decentralization of federated learning present new challenges. Communication between the clients and the server is considered a major bottleneck in the convergence time of federated learning because of the very large number of model weights that must be exchanged in each training round. In this paper, we propose and study Adaptive Federated Dropout (AFD), a novel technique to reduce the communication costs associated with federated learning. It optimizes both server-client communication costs and computation costs by allowing clients to train locally on a selected subset of the global model. We empirically show that this strategy, combined with existing compression methods, provides up to a 57x reduction in convergence time. It also outperforms state-of-the-art solutions for communication efficiency. Furthermore, it improves model generalization by up to 1.7%.
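The sub-model exchange the abstract describes can be made concrete with a short sketch. The Python/NumPy code below is an illustrative assumption, not the authors' implementation: the helper names (select_submodel, merge_update), the random per-weight mask, and the keep_fraction parameter are hypothetical stand-ins for AFD's actual adaptive selection of which parts of the global model each client trains.

import numpy as np

def select_submodel(weights, keep_fraction=0.75, rng=None):
    # Choose a subset of each layer's weights to ship to a client.
    # Random selection here is a placeholder for AFD's adaptive selection.
    rng = rng or np.random.default_rng()
    masks, sub_weights = [], []
    for w in weights:
        mask = rng.random(w.shape) < keep_fraction  # True = weight is kept
        masks.append(mask)
        sub_weights.append(w[mask])                 # flat, smaller payload
    return sub_weights, masks

def merge_update(weights, sub_weights, masks):
    # Write a client's trained sub-model back into the global model.
    merged = [w.copy() for w in weights]
    for w, sub, mask in zip(merged, sub_weights, masks):
        w[mask] = sub
    return merged

# Toy round: a two-layer "model" and a single client.
global_weights = [np.ones((4, 4)), np.ones((4, 2))]
sub, masks = select_submodel(global_weights, keep_fraction=0.5)
sub = [s - 0.1 for s in sub]  # stand-in for local client training
global_weights = merge_update(global_weights, sub, masks)

In a real system the mask (or a seed that regenerates it) must also be communicated; the per-round savings come from shipping only the retained weights, and AFD's contribution is choosing that subset adaptively rather than uniformly at random.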
Pages: 6
Related Papers
50 records in total
  • [1] Federated Pruning: Improving Neural Network Efficiency with Federated Learning
    Lin, Rongmei
    Xiao, Yonghui
    Yang, Tien-Ju
    Zhao, Ding
    Xiong, Li
    Motta, Giovanni
    Beaufays, Francoise
    INTERSPEECH 2022, 2022, : 1701 - 1705
  • [2] Improving Global Generalization and Local Personalization for Federated Learning
    Meng, Lei
    Qi, Zhuang
    Wu, Lei
    Du, Xiaoyu
    Li, Zhaochuan
    Cui, Lizhen
    Meng, Xiangxu
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024
  • [3] Improving Generalization in Federated Learning by Seeking Flat Minima
    Caldarola, Debora
    Caputo, Barbara
    Ciccone, Marco
    COMPUTER VISION, ECCV 2022, PT XXIII, 2022, 13683 : 654 - 672
  • [4] Adaptive client and communication optimizations in Federated Learning
    Wu, Jiagao
    Wang, Yu
    Shen, Zhangchi
    Liu, Linfeng
    INFORMATION SYSTEMS, 2023, 116
  • [5] Computation and Communication Efficient Adaptive Federated Optimization of Federated Learning for Internet of Things
    Chen, Zunming
    Cui, Hongyan
    Wu, Ensen
    Yu, Xi
    ELECTRONICS, 2023, 12 (16)
  • [6] Evaluating the Communication Efficiency in Federated Learning Algorithms
    Asad, Muhammad
    Moustafa, Ahmed
    Ito, Takayuki
    Aslam, Muhammad
    PROCEEDINGS OF THE 2021 IEEE 24TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN (CSCWD), 2021, : 552 - 557
  • [7] Communication-Efficient Adaptive Federated Learning
    Wang, Yujia
    Lin, Lu
    Chen, Jinghui
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [8] Communication and computation efficiency in Federated Learning: A survey
    Almanifi, Omair Rashed Abdulwareth
    Chow, Chee-Onn
    Tham, Mau-Luen
    Chuah, Joon Huang
    Kanesan, Jeevan
    INTERNET OF THINGS, 2023, 22
  • [9] Improving Generalization and Personalization in Model-Heterogeneous Federated Learning
    Zhang, Xiongtao
    Wang, Ji
    Bao, Weidong
    Zhang, Yaohong
    Zhu, Xiaomin
    Peng, Hao
    Zhao, Xiang
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024
  • [10] On improving the regional transportation efficiency based on federated learning
    Su, Zhongqing
    Li, Congduan
    JOURNAL OF THE FRANKLIN INSTITUTE-ENGINEERING AND APPLIED MATHEMATICS, 2023, 360 (07): : 4973 - 5000