Semi-HFL: semi-supervised federated learning for heterogeneous devices

Cited by: 6
Authors
Zhong, Zhengyi [1 ]
Wang, Ji [1 ]
Bao, Weidong [1 ]
Zhou, Jingxuan [1 ]
Zhu, Xiaomin [1 ]
Zhang, Xiongtao [1 ]
Affiliations
[1] Natl Univ Def Technol, Coll Syst Engn, Deya Rd, Changsha 410000, Hunan, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Federated learning; System heterogeneity; Semi-supervised learning; Multi-branch model;
DOI
10.1007/s40747-022-00894-4
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In the vanilla federated learning (FL) framework, the central server distributes a globally unified model to every client, and training relies on labeled samples. In practice, however, clients are equipped with different devices and operate under widely varying conditions: they differ greatly in storage, computing, communication, and other resources, so the unified deep model used in traditional FL cannot match each client's resource constraints. Moreover, traditional FL requires a great deal of labeled data, whereas labeling demands substantial time and effort that individual clients can rarely afford. As a result, clients typically hold only large amounts of unlabeled data, which conflicts with the requirements of conventional FL. To address these two issues, we propose Semi-HFL, a semi-supervised federated learning approach for heterogeneous devices, which divides a deep model into a series of small submodels by inserting early-exit branches so as to meet the resource constraints of different devices. Furthermore, considering the limited availability of labeled data, Semi-HFL introduces semi-supervised techniques into this heterogeneous learning process. Specifically, the semi-supervised learning process comprises two training phases, unsupervised learning on the clients and supervised learning on the server, which makes full use of the clients' unlabeled data. Image classification, text classification, next-word prediction, and multi-task FL experiments on five datasets verify that, compared with the traditional homogeneous learning method, Semi-HFL not only achieves higher accuracy but also significantly reduces the global resource overhead.
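The abstract describes two mechanisms: splitting a deep model into depth-varying submodels by inserting early-exit branches, and a two-phase semi-supervised round with unsupervised training on clients and supervised training on the server. The Python/PyTorch sketch below is a minimal, hypothetical illustration of those two ideas, not the authors' implementation; the network layout, the confidence-thresholded pseudo-labeling rule, and all names (SimpleBranchyNet, client_unsupervised_step, server_supervised_step) are illustrative assumptions.

# Hypothetical sketch of an early-exit model and a two-phase semi-supervised FL round.
# Not the paper's code; layer sizes, pseudo-labeling, and function names are assumptions.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleBranchyNet(nn.Module):
    """A small CNN with one early-exit classifier after each block.
    A weak device trains only up to a shallow exit; a strong device uses a deeper one."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.blocks = nn.ModuleList([
            nn.Sequential(nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2)),
            nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2)),
            nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU()),
        ])
        self.exits = nn.ModuleList([
            nn.Linear(16, num_classes),
            nn.Linear(32, num_classes),
            nn.Linear(64, num_classes),
        ])

    def forward(self, x, exit_idx):
        # Run only the blocks up to the requested exit, then classify from pooled features.
        for i in range(exit_idx + 1):
            x = self.blocks[i](x)
        feat = F.adaptive_avg_pool2d(x, 1).flatten(1)
        return self.exits[exit_idx](feat)

def client_unsupervised_step(global_model, exit_idx, unlabeled_batch, lr=1e-3, threshold=0.9):
    """Client phase: train the submodel up to `exit_idx` on confident pseudo-labels only."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    with torch.no_grad():
        probs = F.softmax(global_model(unlabeled_batch, exit_idx), dim=1)
        conf, pseudo = probs.max(dim=1)
        mask = conf >= threshold  # keep only high-confidence predictions
    if mask.any():
        logits = model(unlabeled_batch[mask], exit_idx)
        loss = F.cross_entropy(logits, pseudo[mask])
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model.state_dict()

def server_supervised_step(model, labeled_x, labeled_y, lr=1e-3):
    """Server phase: fine-tune every exit jointly on the server's labeled data."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss = sum(F.cross_entropy(model(labeled_x, e), labeled_y) for e in range(len(model.exits)))
    opt.zero_grad()
    loss.backward()
    opt.step()
    return model

# Example round with assumed 28x28 grayscale inputs; a single client stands in for
# many clients, whose submodel updates would normally be averaged before this step.
model = SimpleBranchyNet()
unlabeled = torch.randn(32, 1, 28, 28)
labeled_x, labeled_y = torch.randn(16, 1, 28, 28), torch.randint(0, 10, (16,))
client_state = client_unsupervised_step(model, exit_idx=0, unlabeled_batch=unlabeled)
model.load_state_dict(client_state)
model = server_supervised_step(model, labeled_x, labeled_y)

In a full round, the server would aggregate the submodel updates returned by many heterogeneous clients (for example, by averaging the parameters each submodel shares with the global model) before running its supervised phase; the single load_state_dict call above is only a stand-in for that aggregation.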
Pages: 1995-2017
Number of pages: 23