ASCFL: Accurate and Speedy Semi-Supervised Clustering Federated Learning

Cited by: 4
Authors
He, Jingyi [1 ]
Gong, Biyao [1 ]
Yang, Jiadi [1 ]
Wang, Hai [1 ]
Xu, Pengfei [1 ]
Xing, Tianzhang [1 ,2 ]
Affiliations
[1] Northwest Univ, Sch Informat Sci & Technol, Xian 710100, Peoples R China
[2] Northwest Univ, Internet Things Res Ctr, Xian 710100, Peoples R China
Source
TSINGHUA SCIENCE AND TECHNOLOGY | 2023, Vol. 28, No. 5
Funding
National Natural Science Foundation of China;
Keywords
federated learning; clustered federated learning; non-Independent Identically Distribution (non-IID) data; similarity indicator; client selection; semi-supervised learning;
DOI
10.26599/TST.2022.9010057
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
The influence of non-Independent Identically Distributed (non-IID) data on Federated Learning (FL) has been a serious concern. Clustered Federated Learning (CFL) is an emerging approach that reduces the impact of non-IID data by clustering clients according to similarity computed from relevant metrics. Unfortunately, existing CFL methods pursue accuracy improvements alone and ignore the convergence rate. Additionally, the chosen client selection strategy affects the clustering results. Finally, traditional semi-supervised learning changes the distribution of data on clients, resulting in higher local costs and undesirable performance. In this paper, we propose a novel CFL method named ASCFL, which selects clients to participate in training and can dynamically adjust the balance between accuracy and convergence speed on datasets consisting of labeled and unlabeled data. To handle unlabeled data, the label prediction strategy predicts labels using encoders. The client selection strategy improves accuracy and reduces overhead by selecting clients with higher losses to participate in the current round. Moreover, the similarity-based clustering strategy uses a new indicator to measure the similarity between clients. Experimental results on two popular datasets show that ASCFL outperforms three state-of-the-art methods in both model accuracy and convergence speed.
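The loss-based client selection described in the abstract can be illustrated with a minimal sketch. The function name, interface, and selection fraction below are hypothetical illustrations, not taken from the paper: each round, the server ranks clients by their last reported local loss and keeps the highest-loss fraction for training.

```python
def select_clients(client_losses, fraction=0.3):
    """Hypothetical sketch of loss-based client selection: pick the
    fraction of clients with the highest reported local losses to
    participate in the current round."""
    k = max(1, int(len(client_losses) * fraction))
    # Rank client ids by descending local loss and keep the top-k.
    ranked = sorted(client_losses, key=client_losses.get, reverse=True)
    return ranked[:k]

# Example: with four clients and fraction=0.5, the two highest-loss
# clients are selected.
losses = {"c1": 0.9, "c2": 0.2, "c3": 0.7, "c4": 0.4}
print(select_clients(losses, fraction=0.5))  # ['c1', 'c3']
```

The intuition is that high-loss clients are the ones the current global model fits worst, so training on them first tends to improve accuracy faster while reducing communication spent on already well-fit clients.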
Pages: 823-837 (15 pages)
Related papers
50 in total
  • [21] Federated semi-supervised learning based on truncated Gaussian aggregation
    Zhu, Suxia
    Wang, Yunmeng
    Sun, Guanglu
    2025, 81 (01)
  • [22] Exploitation Maximization of Unlabeled Data for Federated Semi-Supervised Learning
    Chen, Siguang
    Shen, Jianhua
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2024, : 1 - 6
  • [23] Learning Bregman Distance Functions for Semi-Supervised Clustering
    Wu, Lei
    Hoi, Steven C. H.
    Jin, Rong
    Zhu, Jianke
    Yu, Nenghai
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2012, 24 (03) : 478 - 491
  • [24] Semi-Supervised clustering and Local Scale Learning algorithm
    Bchir, Ouiem
    Frigui, Hichem
    Ben Ismail, Mohamed Maher
    WORLD CONGRESS ON COMPUTER & INFORMATION TECHNOLOGY (WCCIT 2013), 2013,
  • [25] Adaptive and structured graph learning for semi-supervised clustering
    Chen, Long
    Zhong, Zhi
    INFORMATION PROCESSING & MANAGEMENT, 2022, 59 (04)
  • [26] Active Learning of Constraints for Semi-supervised Text Clustering
    Huang, Ruizhang
    Lam, Wai
    Zhang, Zhigang
    PROCEEDINGS OF THE SEVENTH SIAM INTERNATIONAL CONFERENCE ON DATA MINING, 2007, : 113 - 124
  • [27] Scalable semi-supervised clustering by spectral kernel learning
    Baghshah, M. Soleymani
    Afsari, F.
    Shouraki, S. Bagheri
    Eslami, E.
    PATTERN RECOGNITION LETTERS, 2014, 45 : 161 - 171
  • [28] Clustering Network Traffic Using Semi-Supervised Learning
    Krajewska, Antonina
    Niewiadomska-Szynkiewicz, Ewa
    ELECTRONICS, 2024, 13 (14)
  • [29] Local Clustering with Mean Teacher for Semi-supervised learning
    Chen, Zexi
    Dutton, Benjamin
    Ramachandra, Bharathkumar
    Wu, Tianfu
    Vatsavai, Ranga Raju
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 6243 - 6250
  • [30] Structured graph learning for clustering and semi-supervised classification
    Kang, Zhao
    Peng, Chong
    Cheng, Qiang
    Liu, Xinwang
    Peng, Xi
    Xu, Zenglin
    Tian, Ling
    PATTERN RECOGNITION, 2021, 110