An Element-Wise Weights Aggregation Method for Federated Learning

Cited by: 1
Authors
Hu, Yi [1 ]
Ren, Hanchi [1 ]
Hu, Chen [1 ]
Deng, Jingjing [2 ]
Xie, Xianghua [1 ]
Affiliations
[1] Swansea Univ, Dept Comp Sci, Swansea, W Glam, Wales
[2] Univ Durham, Dept Comp Sci, Durham, England
Keywords
Federated Learning; Weights Aggregation; Adaptive Learning
DOI
10.1109/ICDMW60847.2023.00031
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL) is a powerful Machine Learning (ML) paradigm that enables distributed clients to collaboratively learn a shared global model while keeping the data on the original device, thereby preserving privacy. A central challenge in FL is the effective aggregation of local model weights from disparate and potentially unbalanced participating clients. Existing methods often treat each client indiscriminately, applying a single proportion to the entire local model. However, it is empirically advantageous for each individual weight element to be assigned its own proportion. This paper introduces an Element-Wise Weights Aggregation Method for Federated Learning (EWWA-FL) aimed at optimizing learning performance and accelerating convergence. Unlike traditional FL approaches, EWWA-FL aggregates local weights into the global model at the level of individual elements, thereby allowing each participating client to make element-wise contributions to the learning process. By taking into account the unique dataset characteristics of each client, EWWA-FL enhances the robustness of the global model to different datasets while also achieving rapid convergence. The method is flexible enough to employ various weighting strategies. Through comprehensive experiments, we demonstrate the capabilities of EWWA-FL, showing significant improvements in both accuracy and convergence speed across a range of backbones and benchmarks.
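The abstract contrasts assigning one scalar proportion per client with assigning a proportion to every individual weight element. The Python sketch below illustrates that contrast under stated assumptions: the per-element scores, their normalization across clients, and all function and variable names are illustrative choices for this sketch, not the paper's published weighting strategy.

```python
import numpy as np

def fedavg_aggregate(local_weights, sample_counts):
    """Baseline: one scalar proportion per client (FedAvg-style),
    applied uniformly to every layer of that client's model."""
    props = np.asarray(sample_counts, dtype=float)
    props = props / props.sum()
    return {name: sum(p * w[name] for p, w in zip(props, local_weights))
            for name in local_weights[0]}

def element_wise_aggregate(local_weights, local_scores, eps=1e-12):
    """Element-wise aggregation in the spirit of EWWA-FL: every element of
    every layer receives its own proportion, normalized across clients.

    local_weights : list of {layer_name: ndarray}, one dict per client
    local_scores  : list of {layer_name: ndarray} with matching shapes,
                    scoring each element's contribution; how such scores are
                    derived is an assumption of this sketch.
    """
    aggregated = {}
    for name in local_weights[0]:
        scores = np.stack([s[name] for s in local_scores])           # (clients, *shape)
        props = scores / (scores.sum(axis=0, keepdims=True) + eps)   # per-element proportions
        weights = np.stack([w[name] for w in local_weights])
        aggregated[name] = (props * weights).sum(axis=0)             # element-wise blend
    return aggregated
```

Normalizing the scores across clients keeps the proportions for each individual element summing to one, mirroring how the per-client proportions sum to one in the scalar baseline.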
Pages: 188 - 196
Page count: 9