FLAC: Federated Learning with Autoencoder Compression and Convergence Guarantee

Cited by: 10
Authors:
Beitollahi, Mahdi [1 ]
Lu, Ning [1 ]
Affiliation:
[1] Queens Univ, Dept Elect & Comp Engn, Kingston, ON K7L 3N6, Canada
Keywords:
DOI:
10.1109/GLOBECOM48099.2022.10000743
CLC Number: TP [Automation Technology, Computer Technology]
Discipline Code: 0812
Abstract:
Federated Learning (FL) is considered the key approach for privacy-preserving, distributed machine learning (ML) systems. However, due to the transmission of large ML models from users to the server in each iteration of FL, communication on resource-constrained networks is currently a fundamental bottleneck in FL, restricting the ML model complexity and user participation. One of the notable trends to reduce the communication cost of FL systems is gradient compression, in which techniques in the form of sparsification or quantization are utilized. However, these methods are pre-fixed and do not capture the redundant, correlated information across parameters of the ML models, user devices' data, and iterations of FL. Further, these methods do not fully take advantage of the error-correcting capability of the FL process. In this paper, we propose the Federated Learning with Autoencoder Compression (FLAC) approach that utilizes the redundant information and error-correcting capability of FL to compress user devices' models for uplink transmission. FLAC trains an autoencoder to encode and decode users' models at the server during the Training State, and then sends the autoencoder to user devices for compressing local models in subsequent iterations during the Compression State. To guarantee the convergence of FL, FLAC dynamically controls the autoencoder error by switching between the Training State and the Compression State, adjusting its autoencoder and compression rate based on the error tolerance of the FL system. We theoretically prove that FLAC converges for FL systems with strongly convex ML models and non-i.i.d. data distributions. Our extensive experimental results over three datasets with different network architectures show that FLAC achieves compression rates ranging from 83x to 875x while staying within 7 percent of the accuracy of the non-compressed FL systems.
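The state-switching mechanism described in the abstract can be illustrated with a short sketch. Below, devices send full updates during a Training State (while the server fits a compressor) and only compressed codes during a Compression State, with a fallback to the Training State whenever the reconstruction error exceeds a tolerance. This is a minimal, assumed reading of the abstract, not the authors' implementation: the paper trains a neural autoencoder at the server, whereas this sketch substitutes a linear autoencoder fit by truncated SVD, and all names (tol, encode, decode, the synthetic low-rank updates) are illustrative.

# Minimal sketch of FLAC-style switching between Training and Compression
# States. Assumptions: linear autoencoder via truncated SVD stands in for
# the paper's neural autoencoder; `tol`, `encode`, `decode`, and the
# synthetic updates are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
d, k = 256, 16                    # flattened model-update size, code size

def fit_autoencoder(updates):
    """Return (encode, decode) built from the top-k right singular vectors."""
    X = np.stack(updates)                        # (n, d) buffered full updates
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    V = Vt[:k].T                                 # (d, k) orthonormal basis
    return (lambda x: V.T @ x), (lambda z: V @ z)

# Synthetic "redundant" updates: they lie near a low-dimensional subspace,
# mimicking the correlated information FLAC exploits.
B = rng.normal(size=(d, k))
def local_update():
    return B @ rng.normal(size=k) + 0.01 * rng.normal(size=d)

tol = 0.05                        # error tolerance of the FL system (assumed)
state, buffer = "training", []
encode = decode = None

for rnd in range(50):
    u = local_update()            # stand-in for one device's model update

    if state == "training":
        buffer.append(u)          # full update is sent over the uplink
        received = u
        if len(buffer) >= 20:     # enough samples: fit compressor, switch state
            encode, decode = fit_autoencoder(buffer)
            state = "compression"
    else:
        z = encode(u)             # device sends only k floats (16x smaller here)
        received = decode(z)      # server reconstructs the update
        err = np.linalg.norm(received - u) / np.linalg.norm(u)
        if err > tol:             # error exceeds tolerance: fall back and retrain
            state, buffer = "training", []

    # ... the server would aggregate `received` into the global model here ...

In this toy run the compressed uplink carries 16 floats instead of 256; the paper's reported 83x to 875x rates come from its neural autoencoder and real model architectures, which this sketch does not attempt to reproduce.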
Pages: 4589-4594
Page count: 6
Related Papers
50 records in total
  • [21] Compression with Exact Error Distribution for Federated Learning
    Hegazy, Mahmoud
    Leluc, Remi
    Li, Cheuk Ting
    Dieuleveut, Aymeric
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024, 238
  • [22] Federated Multidomain Learning With Graph Ensemble Autoencoder GMM for Emotion Recognition
    Zhang, Chunjiong
    Li, Mingyong
    Wu, Di
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2023, 24 (07) : 7631 - 7641
  • [23] Autoencoder-Enhanced Federated Learning with Reduced Overhead and Lower Latency
    Hsieh, Chi-Kai
    Chien, Feng-Tsun
    Chang, Min-Kuan
    2023 ASIA PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE, APSIPA ASC, 2023, : 2118 - 2123
  • [24] A Hybrid Federated Learning Architecture With Online Learning and Model Compression
    Odeyomi, Olusola T.
    Ajibuwa, Opeyemi
    Roy, Kaushik
    IEEE ACCESS, 2024, 12 : 191046 - 191058
  • [25] A flexible transfer learning framework for Bayesian optimization with convergence guarantee
    Joy, Tinu Theckel
    Rana, Santu
    Gupta, Sunil
    Venkatesh, Svetha
    EXPERT SYSTEMS WITH APPLICATIONS, 2019, 115 : 656 - 672
  • [26] The Role of Communication Time in the Convergence of Federated Edge Learning
    Zhou, Yipeng
    Fu, Yao
    Luo, Zhenxiao
    Hu, Miao
    Wu, Di
    Sheng, Quan Z.
    Yu, Shui
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2022, 71 (03) : 3241 - 3254
  • [27] Accelerating Convergence of Federated Learning in MEC With Dynamic Community
    Sun, Wen
    Zhao, Yong
    Ma, Wenqiang
    Guo, Bin
    Xu, Lexi
    Duong, Trung Q.
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (02) : 1769 - 1784
  • [28] Device Scheduling with Fast Convergence for Wireless Federated Learning
    Shi, Wenqi
    Zhou, Sheng
    Niu, Zhisheng
    ICC 2020 - 2020 IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC), 2020,
  • [29] On the Convergence of Hierarchical Federated Learning with Partial Worker Participation
    Jiang, Xiaohan
    Zhu, Hongbin
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2024, 244 : 1797 - 1824
  • [30] Convergence Analysis for Wireless Federated Learning with Gradient Recycling
    Chen, Zhixiong
    Yi, Wenqiang
    Liu, Yuanwei
    Nallanathan, Arumugam
    2023 INTERNATIONAL WIRELESS COMMUNICATIONS AND MOBILE COMPUTING, IWCMC, 2023, : 1232 - 1237