Communication Cost Reduction with Partial Structure in Federated Learning

Cited: 13
Authors
Kang, Dongseok [1]
Ahn, Chang Wook [1]
Affiliation
[1] Gwangju Inst Sci & Technol, AI Grad Sch, 123 Cheomdangwagi Ro, Gwangju 61005, South Korea
Funding
National Research Foundation of Singapore;
Keywords
federated learning; artificial intelligence; neural network; NEURAL-NETWORKS; DEEP; CHALLENGES
DOI
10.3390/electronics10172081
CLC Number
TP [automation technology, computer technology];
Discipline Code
0812;
Abstract
Federated learning is a distributed learning algorithm that trains a single server-side model using many clients and their local data. Improving the server model requires continuous communication with the clients, and because the number of clients is very large, the algorithm must be designed with communication cost in mind. In this paper, we propose a method that distributes models whose structure differs from that of the server model, matching each distributed model to clients with different data sizes, and that trains the server model from the reconstructed models returned by the clients. The server deploys only a subset of its sequential model, collects gradient updates, and selectively applies those updates to the corresponding parts of the server model. Delivering the server model at lower cost to clients that need only smaller models reduces the communication cost of training compared with the standard method. To verify the effectiveness of the proposed method, an image classification model was evaluated under three data-distribution scenarios on two datasets, confirming that training succeeded at only 0.229 times the communication cost of the standard method.
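
As a rough illustration of the mechanism described in the abstract, the sketch below runs one round of partial-structure aggregation in plain Python/NumPy: each client receives only a prefix of the server's sequential layers, trains on that subset, and the server averages and applies updates only to the layers it actually distributed. Everything here (make_model, local_update, partial_round, the toy layer sizes, the pseudo-gradient "training") is a hypothetical stand-in; the paper's actual model reconstruction and update rule are not reproduced.

    # Minimal sketch of partial-structure federated aggregation.
    # All names and the pseudo-gradient client step are assumptions
    # for illustration, not the paper's implementation.
    import numpy as np

    def make_model(layer_sizes, rng):
        """A toy 'sequential model': one weight matrix per layer."""
        return [rng.standard_normal((a, b)) * 0.01
                for a, b in zip(layer_sizes[:-1], layer_sizes[1:])]

    def local_update(layers, rng, lr=0.1):
        """Stand-in for client-side training: a pseudo-gradient per layer."""
        return [lr * rng.standard_normal(w.shape) for w in layers]

    def partial_round(server_layers, client_depths, rng):
        """One communication round: each client gets only a prefix of the
        server's layers (its 'partial structure'), trains locally, and
        returns updates for just those layers. The server averages the
        updates it received per layer and applies them selectively, so
        shallow layers aggregate more client updates than deep ones."""
        sums = [np.zeros_like(w) for w in server_layers]
        counts = [0] * len(server_layers)
        for depth in client_depths:
            shared = server_layers[:depth]        # downlink: subset only
            updates = local_update(shared, rng)   # client-side training
            for i, g in enumerate(updates):       # uplink: subset only
                sums[i] += g
                counts[i] += 1
        for i, w in enumerate(server_layers):
            if counts[i] > 0:                     # selectively apply
                w -= sums[i] / counts[i]
        return server_layers

    rng = np.random.default_rng(0)
    server = make_model([32, 64, 64, 10], rng)
    # Three clients of different capacity get 1-, 2-, and 3-layer prefixes.
    server = partial_round(server, client_depths=[1, 2, 3], rng=rng)
    print([w.shape for w in server])

In this toy round, the first layer averages updates from all three clients while the last layer is updated by only one, which is how deploying smaller prefixes to smaller clients trims both downlink and uplink traffic.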
Pages: 11