Federated Feature Concatenate Method for Heterogeneous Computing in Federated Learning

Cited by: 2
Authors
Chung, Wu-Chun [1]
Chang, Yung-Chin [1]
Hsu, Ching-Hsien [2,3]
Chang, Chih-Hung [4]
Hung, Che-Lun [4,5]
Affiliations
[1] Chung Yuan Christian Univ, Dept Informat & Comp Engn, Taoyuan, Taiwan
[2] Asia Univ, Dept Comp Sci & Informat Engn, Taichung, Taiwan
[3] China Med Univ, China Med Univ Hosp, Dept Med Res, Taichung, Taiwan
[4] Providence Univ, Dept Comp Sci & Commun Engn, Taichung, Taiwan
[5] Natl Yang Ming Chiao Tung Univ, Inst Biomed Informat, Taipei, Taiwan
Source
CMC-COMPUTERS MATERIALS & CONTINUA, 2023, Vol. 75, No. 1
Keywords
Federated learning; deep learning; artificial intelligence; heterogeneous computing; communication
DOI
10.32604/cmc.2023.035720
CLC number
TP [Automation technology, computer technology]
Subject classification code
0812
Abstract
Federated learning is an emerging machine learning technique that enables clients to collaboratively train a deep learning model without uploading raw data to the aggregation server. Each client may be equipped with different computing resources for model training. A client with lower computing capability needs more time for model training, which prolongs the overall training time in federated learning; it may even fail to train the entire model because of out-of-memory issues. This study tackles these problems and proposes the federated feature concatenate (FedFC) method for federated learning with heterogeneous clients. FedFC leverages model splitting and feature concatenation to offload a portion of the training load from the clients to the aggregation server. Each client in FedFC can collaboratively train the model with a different cutting layer, so the features learned in the deeper layers of the server-side model become more consistent for data classification. Accordingly, FedFC reduces the computation load on resource-constrained clients and accelerates convergence. The effectiveness is verified under different dataset scenarios, such as data and class imbalance among the participating clients, and the performance impact of different cutting layers is evaluated during model training. The experimental results show that the co-adapted features have a critical impact on the classification quality of the deep learning model. Overall, FedFC not only shortens the convergence time but also improves the best accuracy by up to 5.9% and 14.5% compared with conventional federated learning and SplitFed, respectively. In conclusion, the proposed approach is feasible and effective for heterogeneous clients in federated learning.
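To make the model-splitting and feature-concatenation idea in the abstract concrete, the following is a minimal PyTorch sketch, not the authors' implementation: it assumes a toy CNN, a single common cut layer for all clients, and a server-only update step. The class names (ClientFront, ServerBack), layer shapes, and batch sizes are illustrative assumptions.

```python
# Minimal sketch of the FedFC idea under the assumptions stated above.
import torch
import torch.nn as nn

class ClientFront(nn.Module):
    """Client-side sub-model: the layers before the cutting layer."""
    def __init__(self):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # assumed cut point: features leave the client here
        )

    def forward(self, x):
        return self.layers(x)

class ServerBack(nn.Module):
    """Server-side sub-model: the deeper, shared layers after the cut."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, num_classes),
        )

    def forward(self, feats):
        return self.layers(feats)

# One illustrative training round with three heterogeneous clients.
clients = [ClientFront() for _ in range(3)]
server = ServerBack()
optimizer = torch.optim.SGD(server.parameters(), lr=0.01)

# Each client forwards its local mini-batch only up to the cutting layer
# and uploads the intermediate features instead of the raw data.
local_feats = [c(torch.randn(8, 1, 28, 28)) for c in clients]
local_labels = [torch.randint(0, 10, (8,)) for _ in clients]

# The server concatenates the uploaded feature batches along the batch
# dimension and trains the deeper layers on the combined features.
feats = torch.cat([f.detach() for f in local_feats], dim=0)
labels = torch.cat(local_labels, dim=0)

optimizer.zero_grad()
loss = nn.CrossEntropyLoss()(server(feats), labels)
loss.backward()   # gradients stay on the server-side layers in this sketch
optimizer.step()
print(f"server-side loss after one step: {loss.item():.4f}")
```

In the paper's full method, clients may cut at different layers and the client-side layers are also updated; the sketch only shows how feature batches from several clients can be concatenated and consumed by the shared server-side layers.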
Pages: 351-371
Number of pages: 21