Federated Self-Supervised Learning Based on Prototypes Clustering Contrastive Learning for Internet of Vehicles Applications

Cited: 0
Authors
Dai, Cheng [1 ]
Wei, Shuai [1 ]
Dai, Shengxin [1 ]
Garg, Sahil [2 ,3 ]
Kaddoum, Georges [2 ,4 ]
Shamim Hossain, M. [5 ]
Affiliations
[1] Sichuan University, School of Computer Science, Chengdu 610042, China
[2] École de Technologie Supérieure, Electrical Engineering Department, Montreal, QC H3C 1K3, Canada
[3] Chitkara University Institute of Engineering and Technology, Chitkara University, Centre for Research Impact and Outcome, Rajpura 140401, India
[4] Lebanese American University, Artificial Intelligence and Cyber Systems Research Center, Beirut 03797751, Lebanon
[5] King Saud University, College of Computer and Information Sciences, Department of Software Engineering, Riyadh 12372, Saudi Arabia
Funding
National Natural Science Foundation of China
Keywords
Adversarial machine learning - Federated learning - Self-supervised learning - Supervised learning;
DOI
10.1109/JIOT.2024.3453336
Abstract
Federated learning (FL) is a novel paradigm for distributed edge intelligence in Internet of Vehicles (IoV) applications, enabling superior model-training performance without the need to share local data. However, in practical FL architectures, the presence of non-independent and identically distributed (non-IID) data at edge devices, together with randomly participating distributed nodes, can cause model bias and a subsequent drop in overall performance. To solve this problem, a new federated self-supervised learning method based on prototypes clustering contrastive learning (FedPCC) is proposed, which effectively addresses asynchronous edge training and global model bias by introducing an unsupervised prototype layer. The prototype layer maps edge features into a global space and performs clustering, enabling a new server-side aggregation method for global prototypes. The remaining model components are then aggregated according to each client's data weight. In addition, during the parameter deployment phase, we replace the local prototype layer to acquire global knowledge, while employing momentum updates to preserve the local knowledge of the other components. Finally, comprehensive experiments across multiple datasets show that the proposed method achieves state-of-the-art performance, validating its effectiveness. © 2024 IEEE.
Pages: 4692 - 4700
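
The aggregation-and-deployment flow described in the abstract can be illustrated with a short sketch. Below is a minimal Python sketch, assuming models are represented as dicts of NumPy arrays; the names (PROTO_KEY, aggregate_weighted, aggregate_prototypes, deploy_to_client), the use of plain k-means as a stand-in for the prototype clustering step, and the momentum value are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

PROTO_KEY = "prototype_layer"  # hypothetical key for the prototype-layer weights


def aggregate_weighted(client_models, client_sizes):
    """Server side: data-size-weighted average of the non-prototype components."""
    total = float(sum(client_sizes))
    global_model = {}
    for key in client_models[0]:
        if key == PROTO_KEY:
            continue  # prototypes are aggregated by clustering, not by averaging
        global_model[key] = sum(
            (n / total) * model[key] for model, n in zip(client_models, client_sizes)
        )
    return global_model


def aggregate_prototypes(client_prototypes, num_global_prototypes):
    """Server side: cluster the prototypes uploaded by clients into global prototypes.
    Plain k-means is used here purely as an illustrative clustering step."""
    stacked = np.vstack(client_prototypes)  # (total client prototypes, feature dim)
    km = KMeans(n_clusters=num_global_prototypes, n_init=10).fit(stacked)
    return km.cluster_centers_              # (num_global_prototypes, feature dim)


def deploy_to_client(local_model, global_model, global_prototypes, momentum=0.9):
    """Client side: adopt the global prototype layer as-is, and momentum-update
    the remaining components so local knowledge is partially preserved."""
    new_model = {}
    for key, w_local in local_model.items():
        if key == PROTO_KEY:
            new_model[key] = global_prototypes.copy()  # replace with global knowledge
        else:
            new_model[key] = momentum * w_local + (1.0 - momentum) * global_model[key]
    return new_model


# Toy usage with two clients, random weights, and 4-dimensional prototypes.
rng = np.random.default_rng(0)
clients = [
    {"encoder": rng.normal(size=(8, 4)), PROTO_KEY: rng.normal(size=(5, 4))},
    {"encoder": rng.normal(size=(8, 4)), PROTO_KEY: rng.normal(size=(5, 4))},
]
sizes = [1200, 800]
g_model = aggregate_weighted(clients, sizes)
g_protos = aggregate_prototypes([c[PROTO_KEY] for c in clients], num_global_prototypes=5)
updated = deploy_to_client(clients[0], g_model, g_protos, momentum=0.9)
```

In this reading, the prototype layer is copied from the server wholesale while every other component is blended with the global model, mirroring the abstract's distinction between replacing the prototype layer to acquire global knowledge and momentum-updating the rest to retain local knowledge.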