Communication Efficient Federated Learning With Heterogeneous Structured Client Models

Cited by: 4
Authors
Hu, Yao [1 ,2 ]
Sun, Xiaoyan [3 ]
Tian, Ye [4 ,5 ]
Song, Linqi [1 ,2 ]
Tan, Kay Chen [6 ]
Affiliations
[1] City Univ Hong Kong, Dept Comp Sci, Hong Kong 999077, Peoples R China
[2] City Univ Hong Kong, Shenzhen Res Inst, Shenzhen 518057, Peoples R China
[3] China Univ Min & Technol, Sch Informat & Control Engn, Xuzhou 221000, Jiangsu, Peoples R China
[4] Anhui Univ, Inst Phys Sci, Informat Mat & Intelligent Sensing Lab Anhui Prov, Hefei 230601, Peoples R China
[5] Anhui Univ, Inst Informat Technol, Informat Mat & Intelligent Sensing Lab Anhui Prov, Hefei 230601, Peoples R China
[6] Hong Kong Polytech Univ, Dept Comp, Hong Kong 999077, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Servers; Costs; Matrix decomposition; Training; Data models; Optimization; Data privacy; Federated learning; heterogeneous structured model; neural network; singular value decomposition; FACTORIZATION; SYSTEMS;
DOI
10.1109/TETCI.2022.3209345
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Federated learning (FL) has recently attracted much attention due to its superior performance in privacy protection when processing data from different terminals. However, homogeneous deep learning models are pervasively adopted without considering the differences among the data held by various clients, resulting in low learning performance and high communication costs. This paper thus proposes a novel FL framework with heterogeneous structured client models for handling different data scales and investigates its superiority over canonical FL with homogeneous models. Additionally, singular value decomposition is applied to the client models to reduce the amount of transmitted data, i.e., the communication costs. An aggregation mechanism for multiple models on the central server is then presented based on the heterogeneous characteristics of the uploaded parameters and models. The proposed framework is applied to four benchmark classification datasets and a trend-following task on electromagnetic radiation intensity time series data. Experimental results demonstrate that the proposed method can effectively improve the accuracy of local learning models and significantly reduce communication costs.
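The abstract's communication-saving idea can be illustrated with a minimal sketch (a hypothetical example, not the paper's implementation): a client compresses a weight matrix with a truncated SVD and uploads only the rank-k factors, which the server uses to reconstruct an approximation. The function names `svd_compress`/`svd_reconstruct` and the sizes are illustrative assumptions.

```python
import numpy as np

def svd_compress(W, k):
    """Return rank-k SVD factors of W; only these factors are uploaded."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U[:, :k], s[:k], Vt[:k, :]

def svd_reconstruct(U, s, Vt):
    """Server-side approximation: W ~= U @ diag(s) @ Vt."""
    return U @ np.diag(s) @ Vt

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 128))        # one client's weight matrix
U, s, Vt = svd_compress(W, k=16)

full_params = W.size                       # 256 * 128 = 32768
sent_params = U.size + s.size + Vt.size    # 256*16 + 16 + 16*128 = 6160
print(sent_params / full_params)           # ~0.19, i.e. roughly 5x fewer parameters sent
```

For a matrix of shape (m, n) and rank k, the upload shrinks from m*n to k*(m + n + 1) values, so the saving grows as k decreases; the trade-off is the rank-k approximation error.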
Pages: 753 - 767
Page count: 15
Related Papers
50 items in total
  • [1] FedHe: Heterogeneous Models and Communication-Efficient Federated Learning
    Chan, Yun Hin
    Ngai, Edith C. H.
    [J]. 2021 17TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING (MSN 2021), 2021, : 207 - 214
  • [2] Compressed Client Selection for Efficient Communication in Federated Learning
    Mohamed, Aissa Hadj
    Assumpcao, Nicolas R. G.
    Astudillo, Carlos A.
    de Souza, Allan M.
    Bittencourt, Luiz F.
    Villas, Leandro A.
    [J]. 2023 IEEE 20TH CONSUMER COMMUNICATIONS & NETWORKING CONFERENCE, CCNC, 2023,
  • [3] Adaptive client selection with personalization for communication efficient Federated Learning
    de Souza, Allan M.
    Maciel, Filipe
    da Costa, Joahannes B. D.
    Bittencourt, Luiz F.
    Cerqueira, Eduardo
    Loureiro, Antonio A. F.
    Villas, Leandro A.
    [J]. AD HOC NETWORKS, 2024, 157
  • [4] Reducing communication in federated learning via efficient client sampling
    Ribero, Monica
    Vikalo, Haris
    [J]. PATTERN RECOGNITION, 2024, 148
  • [5] FedStar: Efficient Federated Learning on Heterogeneous Communication Networks
    Cao, Jing
    Wei, Ran
    Cao, Qianyue
    Zheng, Yongchun
    Zhu, Zongwei
    Ji, Cheng
    Zhou, Xuehai
    [J]. IEEE TRANSACTIONS ON COMPUTER-AIDED DESIGN OF INTEGRATED CIRCUITS AND SYSTEMS, 2024, 43 (06) : 1848 - 1861
  • [6] Communication-Efficient Federated Learning with Heterogeneous Devices
    Chen, Zhixiong
    Yi, Wenqiang
    Liu, Yuanwei
    Nallanathan, Arumugam
    [J]. ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 3602 - 3607
  • [7] Resilient and Communication Efficient Learning for Heterogeneous Federated Systems
    Zhu, Zhuangdi
    Hong, Junyuan
    Drew, Steve
    Zhou, Jiayu
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [8] Greedy Shapley Client Selection for Communication-Efficient Federated Learning
    Singhal, Pranava
    Pandey, Shashi Raj
    Popovski, Petar
    [J]. IEEE Networking Letters, 2024, 6 (02): : 134 - 138
  • [9] Communication Efficient Heterogeneous Federated Learning based on Model Similarity
    Li, Zhaojie
    Ohtsuki, Tomoaki
    Gui, Guan
    [J]. 2023 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE, WCNC, 2023,
  • [10] Energy-efficient client selection in federated learning with heterogeneous data on edge
    Jianxin Zhao
    Yanhao Feng
    Xinyu Chang
    Chi Harold Liu
    [J]. Peer-to-Peer Networking and Applications, 2022, 15 : 1139 - 1151