SpecFL: An Efficient Speculative Federated Learning System for Tree-based Model Training

Cited by: 1
Authors:
Zhang, Yuhui [1 ,2 ]
Zhao, Lutan [1 ,2 ]
Che, Cheng [1 ,2 ]
Wang, XiaoFeng [3 ]
Meng, Dan [1 ,2 ]
Hou, Rui [1 ,2 ]
Affiliations:
[1] Chinese Acad Sci, Inst Informat Engn, Key Lab Cyberspace Secur Def, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Sch Cyber Secur, Beijing, Peoples R China
[3] Indiana Univ Bloomington, Bloomington, IN USA
Funding: National Natural Science Foundation of China
Keywords: Federated Learning; Tree-based Model; Speculative Training; FAULT-DETECTION; PREDICTION
DOI: 10.1109/HPCA57654.2024.00068
Chinese Library Classification (CLC): TP3 [Computing Technology, Computer Technology]
Discipline Classification Code: 0812
Abstract
Federated tree-based models are popular in many real-world applications owing to their high accuracy and good interpretability. However, the classical synchronous method causes inefficient federated tree model training due to tree node dependencies. Inspired by speculative execution techniques in modern high-performance processors, this paper proposes SpecFL, a novel and efficient speculative federated learning system. Instead of simply waiting, SpecFL optimistically predicts the outcome of the prior tree node. By resolving tree node dependencies with a split point predictor, the training tasks of child tree nodes can be executed speculatively in advance via separate threads. This speculation enables cross-layer concurrent training, thus significantly reducing the waiting time. Furthermore, we propose a greedy speculation policy to exploit speculative training for deeper inter-layer concurrent training and an eager rollback mechanism for lossless model quality. We implement SpecFL and evaluate its efficiency in a real-world federated learning setting with six public datasets. The evaluation results demonstrate that SpecFL can be 2.08-3.33x and 2.14-3.44x faster than the state-of-the-art GBDT and RF implementations, respectively.
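The core idea in the abstract (predict the parent node's split point, speculatively train the child nodes on separate threads, and roll back on a misprediction) can be illustrated with a minimal Python sketch. This is not the authors' implementation: all names (federated_best_split, predict_split, train_node) are hypothetical, a single numeric feature with a toy squared-error criterion stands in for the real split search, and a time.sleep call stands in for the synchronous cross-party communication that SpecFL overlaps with speculative work.

# Minimal sketch of speculative tree-node training (hypothetical names; not
# the SpecFL code). federated_best_split() models the slow synchronous split
# computation, predict_split() models the split point predictor, and child
# subtrees are trained speculatively while the exact split is still pending.
import time
from concurrent.futures import ThreadPoolExecutor

import numpy as np


def exact_split(x, y):
    """Best threshold on a single feature by squared error (toy criterion)."""
    best_t, best_err = None, np.inf
    for t in np.unique(x)[:-1]:          # exclude the max so both sides are non-empty
        left, right = y[x <= t], y[x > t]
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if err < best_err:
            best_t, best_err = t, err
    return best_t


def federated_best_split(x, y):
    """Stand-in for the slow, synchronous federated split search."""
    time.sleep(0.2)                      # simulated communication / aggregation delay
    return exact_split(x, y)


def predict_split(x, y, sample=32):
    """Cheap local guess of the split point from a random sample."""
    idx = np.random.choice(len(x), size=min(sample, len(x)), replace=False)
    xs, ys = x[idx], y[idx]
    return exact_split(xs, ys) if len(np.unique(xs)) > 1 else None


def train_node(x, y, depth):
    if depth == 0 or len(np.unique(x)) < 2:
        return {"leaf": float(y.mean())}

    with ThreadPoolExecutor(max_workers=3) as pool:
        # Kick off the slow federated split search ...
        exact_fut = pool.submit(federated_best_split, x, y)
        # ... and, instead of waiting, speculate on its outcome and start
        # training both children in advance on separate threads.
        guess = predict_split(x, y)
        spec = None
        if guess is not None:
            gl = x <= guess
            spec = (pool.submit(train_node, x[gl], y[gl], depth - 1),
                    pool.submit(train_node, x[~gl], y[~gl], depth - 1))

        t = exact_fut.result()           # the exact split finally arrives
        if spec is not None and guess == t:
            # Speculation hit: commit the children trained in advance.
            left, right = spec[0].result(), spec[1].result()
        else:
            # Misprediction: discard the speculative results (a real system
            # would abort those threads) and retrain with the exact split.
            mask = x <= t
            left = train_node(x[mask], y[mask], depth - 1)
            right = train_node(x[~mask], y[~mask], depth - 1)
    return {"split": float(t), "left": left, "right": right}


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.random(400)
    y = (x > 0.5).astype(float) + 0.1 * rng.standard_normal(400)
    start = time.time()
    tree = train_node(x, y, depth=3)
    print(f"trained a depth-3 toy tree in {time.time() - start:.2f}s")

When the guess is correct, the child subtrees built during the simulated communication delay are committed, which is the cross-layer concurrency the abstract describes; when it is wrong, the speculative results are simply thrown away, a simplified stand-in for the paper's eager rollback mechanism.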
Pages: 817-831
Number of pages: 15