Enabling efficient and low-effort decentralized federated learning with the EdgeFL framework

Cited: 0
Authors
Zhang, Hongyi [1 ]
Bosch, Jan [1 ]
Olsson, Helena Holmstrom [2 ]
Affiliations
[1] Chalmers Univ Technol, Gothenburg, Sweden
[2] Malmo Univ, Malmo, Sweden
Funding
Swedish Research Council
Keywords
Federated learning; Machine learning; Software engineering; Decentralized architecture; Information privacy; DATA PRIVACY;
DOI
10.1016/j.infsof.2024.107600
Chinese Library Classification
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
Context: Federated Learning (FL) has gained prominence as a solution for preserving data privacy in machine learning applications. However, existing FL frameworks pose challenges for software engineers due to implementation complexity, limited customization options, and scalability issues. These limitations prevent the practical deployment of FL, especially in dynamic and resource-constrained edge environments, hindering its widespread adoption.
Objective: To address these challenges, we propose EdgeFL, an efficient and low-effort FL framework designed to overcome the limitations of centralized aggregation, implementation complexity, and scalability. EdgeFL applies a decentralized architecture that eliminates reliance on a central server by enabling direct model training and aggregation among edge nodes, which enhances fault tolerance and adaptability to diverse edge environments.
Methods: We conducted experiments and a case study to demonstrate the effectiveness of EdgeFL. Our approach focuses on reducing weight update latency and facilitating faster model evolution on edge devices.
Results: Our findings indicate that EdgeFL outperforms existing FL frameworks in terms of learning efficiency and performance. By enabling quicker model evolution on edge devices, EdgeFL enhances overall efficiency and responsiveness to changing data patterns.
Conclusion: EdgeFL offers a solution for software engineers and companies seeking the benefits of FL, while effectively overcoming the challenges and privacy concerns associated with traditional FL frameworks. Its decentralized approach and simplified implementation, combined with enhanced customization and fault tolerance, make it suitable for diverse applications and industries.
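The decentralized architecture described above replaces the central aggregation server with direct peer-to-peer averaging among edge nodes. The following minimal sketch illustrates that general idea; it is not EdgeFL's actual implementation, and `local_update`, `aggregate`, and the toy gradients are hypothetical names invented for illustration.

```python
import numpy as np

def local_update(weights, grad, lr=0.1):
    """One local training step on an edge node (plain gradient descent)."""
    return weights - lr * grad

def aggregate(own_weights, peer_weights):
    """Average the node's own model with models fetched directly from peers,
    standing in for the central server of classical FL."""
    return np.mean([own_weights] + peer_weights, axis=0)

# Three edge nodes, each holding its own copy of a 2-parameter model.
nodes = [np.zeros(2) for _ in range(3)]
grads = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]

# One round: each node trains locally, then pulls peers' weights and averages.
trained = [local_update(w, g) for w, g in zip(nodes, grads)]
nodes = [aggregate(w, [p for j, p in enumerate(trained) if j != i])
         for i, w in enumerate(trained)]
```

In this fully connected toy topology every node ends the round with the same averaged model; a real decentralized deployment would typically average over a sparser, possibly changing, neighbor set and handle asynchronous or failed peers.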
Pages: 13
Related papers
50 total
  • [41] A Blockchain-Based Decentralized Federated Learning Framework with Committee Consensus. Li, Yuzheng; Chen, Chuan; Liu, Nan; Huang, Huawei; Zheng, Zibin; Yan, Qiang. IEEE NETWORK, 2021, 35(01): 234-241
  • [42] An efficient federated learning framework for graph learning in hyperbolic space. Du, Haizhou; Liu, Conghao; Liu, Haotian; Ding, Xiaoyu; Huo, Huan. KNOWLEDGE-BASED SYSTEMS, 2024, 289
  • [43] Decentralized federated meta-learning framework for few-shot multitask learning. Li, Xiaoli; Li, Yuzheng; Wang, Jining; Chen, Chuan; Yang, Liu; Zheng, Zibin. INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2022, 37(11): 8490-8522
  • [44] Efficient Communication for Decentralized Federated Learning: An Energy Disaggregation Case Study. Zhang, Yusen; Gao, Feng; Zhou, Kangjia. IEEE 15TH INTERNATIONAL SYMPOSIUM ON POWER ELECTRONICS FOR DISTRIBUTED GENERATION SYSTEMS, PEDG 2024, 2024
  • [45] Secure and Efficient Decentralized Analytics on Digital Twins Using Federated Learning. Uprety, Aashma; Rawat, Danda B. IEEE CONFERENCE ON GLOBAL COMMUNICATIONS, GLOBECOM, 2023: 4716-4721
  • [46] AutoFL: Enabling Heterogeneity-Aware Energy Efficient Federated Learning. Kim, Young Geun; Wu, Carole-Jean. PROCEEDINGS OF 54TH ANNUAL IEEE/ACM INTERNATIONAL SYMPOSIUM ON MICROARCHITECTURE, MICRO 2021, 2021: 183-198
  • [47] Communication Efficient Federated Learning Framework with Local Momentum. Xie, Renyou; Zhou, Xiaojun. 2022 15TH INTERNATIONAL CONFERENCE ON HUMAN SYSTEM INTERACTION (HSI), 2022
  • [48] An asynchronous federated learning focusing on updated models for decentralized systems with a practical framework. Kanamori, Yusuke; Yamasaki, Yusuke; Hosoai, Shintaro; Nakamura, Hiroshi; Takase, Hideki. 2023 IEEE 47TH ANNUAL COMPUTERS, SOFTWARE, AND APPLICATIONS CONFERENCE, COMPSAC, 2023: 1147-1154
  • [49] Decentralized Federated Learning Framework Based on Proof-of-contribution Consensus Mechanism. Qiao S.-J.; Lin Y.-F.; Han N.; Yang G.-P.; Li H.; Yuan G.; Mao R.; Yuan C.-A.; Gutierrez L.A. Ruan Jian Xue Bao/Journal of Software, 2023, 34(03): 1148-1167
  • [50] Decentralized Edge Intelligence: A Dynamic Resource Allocation Framework for Hierarchical Federated Learning. Lim, Wei Yang Bryan; Ng, Jer Shyuan; Xiong, Zehui; Jin, Jiangming; Zhang, Yang; Niyato, Dusit; Leung, Cyril; Miao, Chunyan. IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2022, 33(03): 536-550