Asynchronous Federated Learning for Sensor Data with Concept Drift

Cited by: 14
Authors
Chen, Yujing [1 ]
Chai, Zheng [1 ]
Cheng, Yue [1 ]
Rangwala, Huzefa [1 ]
Affiliations
[1] George Mason Univ, Dept Comp Sci, Fairfax, VA 22030 USA
Keywords
federated learning; asynchronous learning; concept drift; communication-efficient; classification
DOI
10.1109/BigData52589.2021.9671924
Chinese Library Classification (CLC)
TP18 [theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL) involves multiple distributed devices jointly training a shared model without any participant having to reveal its local data to a centralized server. Most previous FL approaches assume that the data on each device are fixed and stationary during the training process. However, this assumption is unrealistic because these devices usually have varying sampling rates and different system configurations. In addition, the underlying distribution of the device data can change dynamically over time, which is known as concept drift. Concept drift complicates the learning process because of the inconsistency between existing and upcoming data. Traditional concept drift handling techniques, such as chunk-based and ensemble learning-based methods, are not suitable for federated learning frameworks due to the heterogeneity of local devices. We propose a novel approach, FedConD, to detect and handle concept drift on local devices and minimize its effect on model performance in asynchronous FL. The drift detection strategy is based on an adaptive mechanism that uses the historical performance of the local models. Drift adaptation is realized by adjusting the regularization parameter of the objective function on each local device. Additionally, we design a communication strategy on the server side that selects local updates prudently and speeds up model convergence. Experimental evaluations on three evolving data streams and two image datasets show that FedConD detects and handles concept drift, and also reduces the overall communication cost compared with other baseline methods.
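To make the abstract's description concrete, the minimal Python sketch below shows one way a local client could combine history-based drift detection with adjustment of a proximal regularization weight. The class name DriftAwareClient, the z-score test over a sliding loss window, the doubling of mu on drift, and all hyperparameters are illustrative assumptions, not FedConD's actual rules; the paper's detection and adaptation mechanisms are given in the full text.

```python
# Illustrative sketch (not the paper's exact algorithm): a local client that
# flags concept drift from its own historical losses and, on drift, adjusts
# the proximal regularization weight of a FedProx-style local objective.
from collections import deque
import statistics

import numpy as np


class DriftAwareClient:
    def __init__(self, dim, mu=0.1, window=20, z_threshold=3.0, lr=0.01):
        self.w = np.zeros(dim)               # local model parameters
        self.mu = mu                         # proximal regularization weight (adapted on drift)
        self.window = deque(maxlen=window)   # recent local losses (historical performance)
        self.z_threshold = z_threshold       # std-devs above the mean that count as drift
        self.lr = lr

    def _loss(self, X, y, w_global):
        # Squared error plus a proximal term keeping w close to the global model.
        pred = X @ self.w
        return float(np.mean((pred - y) ** 2)
                     + 0.5 * self.mu * np.sum((self.w - w_global) ** 2))

    def detect_drift(self, current_loss):
        # Adaptive check: compare the newest loss with the recent loss history.
        if len(self.window) < self.window.maxlen:
            return False
        mean = statistics.mean(self.window)
        std = statistics.pstdev(self.window) or 1e-8
        return (current_loss - mean) / std > self.z_threshold

    def local_update(self, X, y, w_global, epochs=5):
        loss = self._loss(X, y, w_global)
        if self.detect_drift(loss):
            # On drift, pull harder toward the global model so inconsistent local
            # data has less influence. Doubling mu is one plausible adjustment,
            # chosen here only for illustration.
            self.mu *= 2.0
        self.window.append(loss)
        for _ in range(epochs):
            grad = 2 * X.T @ (X @ self.w - y) / len(y) + self.mu * (self.w - w_global)
            self.w -= self.lr * grad
        return self.w.copy()
```

Under these assumptions, a drifting client is regularized more strongly toward the global model, which mirrors the abstract's idea of limiting the impact of data that has become inconsistent with the model learned so far.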
Pages: 4822-4831
Page count: 10
Related Papers
50 items in total
  • [41] Federated Machine Learning: Concept and Applications
    Yang, Qiang
    Liu, Yang
    Chen, Tianjian
    Tong, Yongxin
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2019, 10 (02)
  • [42] Federated transfer learning: Concept and applications
    Saha, Sudipan
    Ahmad, Tahir
    INTELLIGENZA ARTIFICIALE, 2021, 15 (01) : 35 - 44
  • [43] A Multiscale Concept Drift Detection Method for Learning from Data Streams
    Wang, XueSong
    Kang, Qi
    Zhou, MengChu
    Yao, SiYa
    2018 IEEE 14TH INTERNATIONAL CONFERENCE ON AUTOMATION SCIENCE AND ENGINEERING (CASE), 2018, : 786 - 790
  • [44] Combining active learning with concept drift detection for data stream mining
    Krawczyk, Bartosz
    Pfahringer, Bernhard
    Wozniak, Michal
    2018 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2018, : 2239 - 2244
  • [45] An Augmented Learning Approach for Multiple Data Streams Under Concept Drift
    Wang, Kun
    Lu, Jie
    Liu, Anjin
    Zhang, Guangquan
    ADVANCES IN ARTIFICIAL INTELLIGENCE, AI 2023, PT I, 2024, 14471 : 391 - 402
  • [46] Incremental Learning of Bayesian Networks from Concept-Drift Data
    Yu, Haibo
    2019 IEEE 4TH INTERNATIONAL CONFERENCE ON CLOUD COMPUTING AND BIG DATA ANALYSIS (ICCCBDA), 2019, : 701 - 704
  • [47] Client Selection With Staleness Compensation in Asynchronous Federated Learning
    Zhu, Hongbin
    Kuang, Junqian
    Yang, Miao
    Qian, Hua
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2023, 72 (03) : 4124 - 4129
  • [48] An Efficient Asynchronous Federated Learning Protocol for Edge Devices
    Li, Qian
    Gao, Ziyi
    Sun, Yetao
    Wang, Yan
    Wang, Rui
    Zhu, Haiyan
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (17) : 28798 - 28808
  • [49] AsyFed: Accelerated Federated Learning With Asynchronous Communication Mechanism
    Li, Zhixin
    Huang, Chunpu
    Gai, Keke
    Lu, Zhihui
    Wu, Jie
    Chen, Lulu
    Xu, Yangchuan
    Choo, Kim-Kwang Raymond
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (10) : 8670 - 8683
  • [50] Asynchronous Online Federated Learning With Reduced Communication Requirements
    Gauthier, Francois
    Gogineni, Vinay Chakravarthi
    Werner, Stefan
    Huang, Yih-Fang
    Kuh, Anthony
    IEEE INTERNET OF THINGS JOURNAL, 2023, 10 (23) : 20761 - 20775