FedDNA: Federated learning using dynamic node alignment

Cited by: 2
Authors
Wang, Shuwen [1]
Zhu, Xingquan [1]
Affiliations
[1] Florida Atlantic Univ, Dept Elect Engn & Comp Sci, Boca Raton, FL 33431 USA
Source
PLOS ONE | 2023, Vol. 18, Issue 7
Funding
US National Science Foundation;
DOI
10.1371/journal.pone.0288157
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline Codes
07; 0710; 09;
Abstract
Federated Learning (FL), a new computing framework, has received significant attention recently due to its advantages in preserving data privacy while training high-performance models. During FL, distributed sites first learn their respective parameters. A central site then consolidates the learned parameters, by averaging or other approaches, and disseminates new weights across all sites to carry out the next round of learning. The distributed parameter learning and consolidation repeat iteratively until the algorithm converges or terminates. Many FL methods exist to aggregate weights from distributed sites, but most use a static node alignment approach, in which nodes of the distributed networks are statically assigned, in advance, to matched positions whose weights are then aggregated. In reality, neural networks, especially dense networks, have nontransparent roles with respect to individual nodes. Combined with the random nature of the networks, static node matching often does not produce the best matching between nodes across sites. In this paper, we propose FedDNA, a dynamic node alignment federated learning algorithm. Our theme is to find the best matching nodes between different sites, and then aggregate the weights of matched nodes for federated learning. For each node in a neural network, we represent its weight values as a vector, and use a distance function to find the most similar nodes, i.e., the nodes with the smallest distance at the other sites. Because finding the best matching across all sites is computationally expensive, we further design a minimum spanning tree based approach to ensure that each node from each site has matched peers at the other sites, such that the total pairwise distance across all sites is minimized. Experiments and comparisons demonstrate that FedDNA outperforms commonly used baselines, such as FedAvg, for federated learning.
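The core idea in the abstract, representing each node by its incoming weight vector, matching nodes across sites by distance, and averaging matched pairs, can be illustrated with a minimal two-site sketch. This is not the paper's algorithm: a greedy nearest-neighbor matching stands in for the paper's minimum-spanning-tree multi-site matching, and all function names (`match_nodes`, `aggregate`) are illustrative assumptions.

```python
import numpy as np

def match_nodes(weights_a, weights_b):
    """Pair each node (row = incoming weight vector) of site A with its
    closest still-unmatched node of site B by Euclidean distance.
    Greedy stand-in for FedDNA's MST-based matching; returns (i, j) pairs."""
    # Full pairwise distance matrix between A's and B's node weight vectors.
    dist = np.linalg.norm(weights_a[:, None, :] - weights_b[None, :, :], axis=2)
    pairs, used = [], set()
    # Process A's nodes in order of how good their best match is.
    for i in np.argsort(dist.min(axis=1)):
        j = min((j for j in range(len(weights_b)) if j not in used),
                key=lambda j: dist[i, j])
        pairs.append((int(i), int(j)))
        used.add(j)
    return pairs

def aggregate(weights_a, weights_b):
    """Average the weights of matched node pairs (two-site FedAvg-style
    consolidation, but over dynamically aligned nodes rather than positions)."""
    pairs = match_nodes(weights_a, weights_b)
    out = np.empty_like(weights_a)
    for i, j in pairs:
        out[i] = (weights_a[i] + weights_b[j]) / 2.0
    return out
```

With site B holding a node-permuted copy of site A's layer, position-wise (static) averaging would blend unrelated nodes, whereas the matching above recovers the permutation before aggregating.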
Pages: 25
Related Papers
50 records total
  • [41] Accelerating Convergence of Federated Learning in MEC With Dynamic Community
    Sun, Wen
    Zhao, Yong
    Ma, Wenqiang
    Guo, Bin
    Xu, Lexi
    Duong, Trung Q.
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (02) : 1769 - 1784
  • [42] A federated learning scheme meets dynamic differential privacy
    Guo, Shengnan
    Wang, Xibin
    Long, Shigong
    Liu, Hai
    Hai, Liu
    Sam, Toong Hai
    CAAI TRANSACTIONS ON INTELLIGENCE TECHNOLOGY, 2023, 8 (03) : 1087 - 1100
  • [43] Dynamic Personalized Federated Learning with Adaptive Differential Privacy
    Yang, Xiyuan
    Huang, Wenke
    Ye, Mang
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [44] Using Federated Learning on Malware Classification
    Lin, Kuang-Yao
    Huang, Wei-Ren
    2020 22ND INTERNATIONAL CONFERENCE ON ADVANCED COMMUNICATION TECHNOLOGY (ICACT): DIGITAL SECURITY GLOBAL AGENDA FOR SAFE SOCIETY!, 2020, : 585 - 589
  • [45] Personalized Federated Learning using Hypernetworks
    Shamsian, Aviv
    Navon, Aviv
    Fetaya, Ethan
    Chechik, Gal
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [46] DFL: Dynamic Federated Split Learning in Heterogeneous IoT
    Samikwa, Eric
    Di Maio, Antonio
    Braun, Torsten
    IEEE TRANSACTIONS ON MACHINE LEARNING IN COMMUNICATIONS AND NETWORKING, 2024, 2 : 733 - 752
  • [47] Probabilistic Node Selection for Federated Learning with Heterogeneous Data in Mobile Edge
    Wu, Hongda
    Wang, Ping
    2022 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE (WCNC), 2022, : 2453 - 2458
  • [48] A Vision For Hierarchical Federated Learning in Dynamic Service Chaining
    Bittar, Abdullah
    Huang, Changcheng
    2022 IEEE CONFERENCE ON NETWORK FUNCTION VIRTUALIZATION AND SOFTWARE DEFINED NETWORKS (IEEE NFV-SDN), 2022, : 103 - 107
  • [49] Dynamic Games in Federated Learning Training Service Market
    Zou, Yuze
    Feng, Shaohan
    Xu, Jing
    Gong, Shimin
    Niyato, Dusit
    Cheng, Wenqing
    2019 IEEE PACIFIC RIM CONFERENCE ON COMMUNICATIONS, COMPUTERS AND SIGNAL PROCESSING (PACRIM), 2019,
  • [50] Toward Node Liability in Federated Learning: Computational Cost and Network Overhead
    Malandrino, Francesco
    Chiasserini, Carla Fabiana
    IEEE COMMUNICATIONS MAGAZINE, 2021, 59 (09) : 72 - 77