Communication Efficient Federated Learning With Heterogeneous Structured Client Models

Cited by: 7
Authors
Hu, Yao [1 ,2 ]
Sun, Xiaoyan [3 ]
Tian, Ye [4 ,5 ]
Song, Linqi [1 ,2 ]
Tan, Kay Chen [6 ]
Affiliations
[1] City Univ Hong Kong, Dept Comp Sci, Hong Kong 999077, Peoples R China
[2] City Univ Hong Kong, Shenzhen Res Inst, Shenzhen 518057, Peoples R China
[3] China Univ Min & Technol, Sch Informat & Control Engn, Xuzhou 221000, Jiangsu, Peoples R China
[4] Anhui Univ, Inst Phys Sci, Informat Mat & Intelligent Sensing Lab Anhui Prov, Hefei 230601, Peoples R China
[5] Anhui Univ, Inst Informat Technol, Informat Mat & Intelligent Sensing Lab Anhui Prov, Hefei 230601, Peoples R China
[6] Hong Kong Polytech Univ, Dept Comp, Hong Kong 999077, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Servers; Costs; Matrix decomposition; Training; Data models; Optimization; Data privacy; Federated learning; heterogeneous structured model; neural network; singular value decomposition; FACTORIZATION; SYSTEMS;
DOI
10.1109/TETCI.2022.3209345
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL) has recently attracted much attention due to its superior performance in privacy protection when processing data from different terminals. However, homogeneous deep learning models are pervasively adopted without accounting for differences among the data held by individual clients, resulting in low learning performance and high communication costs. This paper therefore proposes a novel FL framework with heterogeneous structured client models for handling data of different scales and investigates its superiority over canonical FL with homogeneous models. Additionally, singular value decomposition (SVD) is applied to the client models to reduce the amount of transmitted data, i.e., the communication costs. An aggregation mechanism for multiple models on the central server is then presented based on the heterogeneous characteristics of the uploaded parameters and models. The proposed framework is applied to four benchmark classification datasets and to a trend-following task on electromagnetic radiation intensity time-series data. Experimental results demonstrate that the proposed method can effectively improve the accuracy of local learning models and significantly reduce communication costs.
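The communication-saving step described in the abstract is rank truncation via SVD: a client uploads only the leading singular factors of each weight matrix instead of the full matrix. The record does not give the paper's actual protocol, so the sketch below is a minimal generic illustration; the function names and the fixed rank are assumptions, not the authors' method.

```python
import numpy as np

def compress_weights(W, rank):
    """Truncated SVD of a weight matrix W of shape (d_out, d_in).

    Uploading the top-`rank` factors costs rank * (d_out + d_in + 1)
    values instead of d_out * d_in, so any rank below
    d_out * d_in / (d_out + d_in + 1) shrinks the payload.
    Illustrative sketch only, not the paper's exact scheme.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U[:, :rank], s[:rank], Vt[:rank, :]

def decompress_weights(U_r, s_r, Vt_r):
    """Server-side reconstruction of the approximate weight matrix."""
    return (U_r * s_r) @ Vt_r  # rescale columns of U_r, then project back

# Example: a 512 x 256 layer uploaded at rank 32 (hypothetical sizes).
W = np.random.randn(512, 256)
U_r, s_r, Vt_r = compress_weights(W, rank=32)
sent = U_r.size + s_r.size + Vt_r.size           # 32 * (512 + 256 + 1) = 24608
full = W.size                                    # 512 * 256 = 131072
print(f"payload vs. full upload: {sent / full:.2%}")  # ~18.8%
```

Under these assumed sizes, the rank-32 upload carries under a fifth of the full parameter payload, which matches the abstract's claim that SVD reduces communication costs; the accuracy impact depends on how fast the singular values of the trained weights decay.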
Pages: 753-767 (15 pages)
Related papers
50 records in total
  • [1] FedHe: Heterogeneous Models and Communication-Efficient Federated Learning
    Chan, Yun Hin
    Ngai, Edith C. H.
    2021 17TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING (MSN 2021), 2021, : 207 - 214
  • [2] Efficient Client Sampling with Compression in Heterogeneous Federated Learning
    Marnissi, Ouiame
    El Hammouti, Hajar
    Bergou, El Houcine
IEEE INFOCOM 2024-IEEE CONFERENCE ON COMPUTER COMMUNICATIONS WORKSHOPS, INFOCOM WKSHPS 2024, 2024
  • [3] Compressed Client Selection for Efficient Communication in Federated Learning
    Mohamed, Aissa Hadj
    Assumpcao, Nicolas R. G.
    Astudillo, Carlos A.
    de Souza, Allan M.
    Bittencourt, Luiz F.
    Villas, Leandro A.
2023 IEEE 20TH CONSUMER COMMUNICATIONS & NETWORKING CONFERENCE, CCNC, 2023
  • [4] Adaptive client selection with personalization for communication efficient Federated Learning
    de Souza, Allan M.
    Maciel, Filipe
    da Costa, Joahannes B. D.
    Bittencourt, Luiz F.
    Cerqueira, Eduardo
    Loureiro, Antonio A. F.
    Villas, Leandro A.
    AD HOC NETWORKS, 2024, 157
  • [5] Communication-Efficient Federated Learning With Data and Client Heterogeneity
    Zakerinia, Hossein
    Talaei, Shayan
    Nadiradze, Giorgi
    Alistarh, Dan
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024, 238
  • [6] Reducing communication in federated learning via efficient client sampling
    Ribero, Monica
    Vikalo, Haris
    PATTERN RECOGNITION, 2024, 148
  • [7] Communication-Efficient Federated Learning With Adaptive Aggregation for Heterogeneous Client-Edge-Cloud Network
    Luo, Long
    Zhang, Chi
    Yu, Hongfang
    Sun, Gang
    Luo, Shouxi
    Dustdar, Schahram
    IEEE TRANSACTIONS ON SERVICES COMPUTING, 2024, 17 (06) : 3241 - 3255
  • [8] FedStar: Efficient Federated Learning on Heterogeneous Communication Networks
    Cao, Jing
    Wei, Ran
    Cao, Qianyue
    Zheng, Yongchun
    Zhu, Zongwei
    Ji, Cheng
    Zhou, Xuehai
    IEEE TRANSACTIONS ON COMPUTER-AIDED DESIGN OF INTEGRATED CIRCUITS AND SYSTEMS, 2024, 43 (06) : 1848 - 1861
  • [9] Communication-Efficient Federated Learning with Heterogeneous Devices
    Chen, Zhixiong
    Yi, Wenqiang
    Liu, Yuanwei
    Nallanathan, Arumugam
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 3602 - 3607
  • [10] Resilient and Communication Efficient Learning for Heterogeneous Federated Systems
    Zhu, Zhuangdi
    Hong, Junyuan
    Drew, Steve
    Zhou, Jiayu
INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022