ClassTer: Mobile Shift-Robust Personalized Federated Learning via Class-Wise Clustering

Citations: 0
Authors
Li, Xiaochen [1 ]
Liu, Sicong [1 ]
Zhou, Zimu [2 ]
Xu, Yuan [1 ]
Guo, Bin [1 ]
Yu, Zhiwen [2 ,3 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Comp Sci, Xian 710072, Peoples R China
[2] City Univ Hong Kong, Dept Data Sci, Hong Kong 999077, Peoples R China
[3] Harbin Engn Univ, Sch Comp Sci, Harbin 150001, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Data models; Training; Adaptation models; Mobile handsets; Federated learning; Computational modeling; Convergence; Accuracy; Servers; Mobile applications; Asynchronous mobile devices; personalized federated learning; shift-robust;
DOI
10.1109/TMC.2024.3487294
CLC classification
TP [Automation Technology, Computer Technology]
Subject classification code
0812
Abstract
The rise of mobile devices with abundant sensor data and computing power has driven the trend of federated learning (FL) on them. Personalized FL (PFL) aims to train tailored models for each device, addressing data heterogeneity from diverse user behaviors and preferences. However, due to dynamic mobile environments, PFL faces challenges from test-time data shifts, i.e., variations between training and testing data. While this issue is well studied in generic deep learning through model generalization or adaptation, it remains underexplored in PFL, where models often overfit local data. To address this, we introduce ClassTer, a shift-robust PFL framework. We observe that class-wise clustering of clients in cluster-based PFL (CFL) can avoid class-specific biases by decoupling the training of classes. Thus, we propose a paradigm shift from traditional client-wise clustering to class-wise clustering, which allows effective aggregation of cluster models into a generalized one via knowledge distillation. Additionally, we extend ClassTer to asynchronous mobile clients to optimize wall-clock time by leveraging critical learning periods and both intra- and inter-device scheduling. Experiments show that compared to status quo approaches, ClassTer achieves a reduction of up to 91% in convergence time and an improvement of up to 50.45% in accuracy.
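The abstract's aggregation step, distilling several cluster models into one generalized model, can be illustrated with a minimal sketch. This is not the paper's actual algorithm: the linear "cluster models", the toy data, and the `distill` routine below are all hypothetical, showing only the generic idea of training a student to match the averaged soft labels of an ensemble of teachers.

```python
import math

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T gives softer labels.
    m = max(z)
    e = [math.exp((v - m) / T) for v in z]
    s = sum(e)
    return [v / s for v in e]

def logits(W, x):
    # A toy linear model: weight matrix W maps 2-d features to 3 class logits.
    return [sum(w_j * x_j for w_j, x_j in zip(row, x)) for row in W]

def distill(teachers, data, T=2.0, lr=0.1, epochs=200):
    # Train a fresh linear student to match the teachers' averaged
    # soft labels (cross-entropy gradient on logits: pred - target).
    student = [[0.0, 0.0] for _ in range(3)]
    for _ in range(epochs):
        for x in data:
            probs = [softmax(logits(W, x), T) for W in teachers]
            target = [sum(p[c] for p in probs) / len(probs) for c in range(3)]
            pred = softmax(logits(student, x), T)
            for c in range(3):
                g = pred[c] - target[c]
                for j in range(2):
                    student[c][j] -= lr * g * x[j]
    return student

# Two hypothetical cluster models and a few unlabeled distillation points.
teachers = [
    [[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]],
    [[0.8, 0.2], [0.1, 0.9], [-0.5, -0.5]],
]
data = [[2.0, 0.0], [0.0, 2.0], [-2.0, -2.0]]
student = distill(teachers, data)
```

After distillation, the single student reproduces the ensemble's decisions on the distillation data, so the per-cluster models can be discarded.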
Pages: 2014-2028 (15 pages)