FedKG: A Knowledge Distillation-Based Federated Graph Method for Social Bot Detection

Cited by: 1
Authors
Wang, Xiujuan [1 ]
Chen, Kangmiao [1 ]
Wang, Keke [1 ]
Wang, Zhengxiang [1 ]
Zheng, Kangfeng [2 ]
Zhang, Jiayue [1 ]
Affiliations
[1] Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
[2] Beijing Univ Posts & Telecommun, Sch Cyberspace Secur, Beijing 100876, Peoples R China
Funding
Beijing Natural Science Foundation;
Keywords
social bot detection; federated learning; knowledge distillation; graph neural network;
DOI
10.3390/s24113481
Chinese Library Classification
O65 [Analytical Chemistry];
Subject Classification Codes
070302 ; 081704 ;
Abstract
Malicious social bots pose a serious threat to social network security by spreading false information and steering harmful opinions in social networks. Because any single organization's data are limited in volume and diversity, and labeling social bots is costly, federated models that combine federated learning with social bot detection have emerged. In this paper, we first combine the federated learning framework with the Relational Graph Convolutional Network (RGCN) model to achieve federated social bot detection. A class-level cross-entropy loss function is applied in local model training to mitigate the effects of class imbalance in local data. To address data heterogeneity across participants, we optimize the classical federated learning algorithm by applying knowledge distillation methods. Specifically, we adjust the client-side and server-side models separately: a global generator is trained to produce pseudo-samples based on knowledge of the local data distributions, correcting the optimization direction of the client-side classification models, and the client-side classifiers' knowledge is integrated on the server side to guide the training of the global classification model. We conduct extensive experiments on widely used datasets, and the results demonstrate the effectiveness of our approach for social bot detection in heterogeneous data scenarios. Compared to baseline methods, our approach improves detection accuracy by roughly 3-10% when data heterogeneity is high, and it reaches the specified accuracy with fewer communication rounds.
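The two loss ingredients named in the abstract — a class-level (class-weighted) cross-entropy for imbalanced local data, and a temperature-softened distillation term for transferring knowledge between client and server models — can be sketched as follows. This is a minimal NumPy illustration under assumed conventions (inverse-frequency class weights, Hinton-style KL distillation); the function names and weighting scheme are illustrative, not the paper's exact formulation:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def class_weights(labels, n_classes):
    """Inverse-frequency weights; all ones when classes are balanced
    (assumed scheme -- one common choice for class-level weighting)."""
    counts = np.bincount(labels, minlength=n_classes).astype(float)
    return counts.sum() / (n_classes * np.maximum(counts, 1.0))

def weighted_ce(logits, labels, weights):
    """Class-weighted cross-entropy: rare classes contribute more."""
    p = softmax(logits)
    idx = np.arange(len(labels))
    return float(np.mean(weights[labels] * -np.log(p[idx, labels] + 1e-12)))

def distill_kl(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as in standard knowledge distillation."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=1)
    return float(np.mean(kl) * T * T)
```

In a federated KD setup of this shape, each client would minimize `weighted_ce` on its local graph embeddings plus a `distill_kl` term against soft targets derived from the generator's pseudo-samples, while the server uses `distill_kl` against the aggregated client outputs to train the global classifier.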
Pages: 17
Related Papers
50 results in total
  • [1] An Efficient Knowledge Distillation-Based Detection Method for Infrared Small Targets
    Tang, Wenjuan
    Dai, Qun
    Hao, Fan
    REMOTE SENSING, 2024, 16 (17)
  • [2] FedMEKT: Distillation-based embedding knowledge transfer for multimodal federated learning
    Le, Huy Q.
    Nguyen, Minh N. H.
    Thwal, Chu Myaet
    Qiao, Yu
    Zhang, Chaoning
    Hong, Choong Seon
    NEURAL NETWORKS, 2025, 183
  • [3] Knowledge Distillation-Based GPS Spoofing Detection for Small UAV
    Ren, Yingying
    Restivo, Ryan D.
    Tan, Wenkai
    Wang, Jian
    Liu, Yongxin
    Jiang, Bin
    Wang, Huihui
    Song, Houbing
    FUTURE INTERNET, 2023, 15 (12)
  • [4] Effective Intrusion Detection in Heterogeneous Internet-of-Things Networks via Ensemble Knowledge Distillation-based Federated Learning
    Shen, Jiyuan
    Yang, Wenzhuo
    Chu, Zhaowei
    Fan, Jiani
    Niyato, Dusit
    Lam, Kwok-Yan
    ICC 2024 - IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2024, : 2034 - 2039
  • [5] GFD-SSL: generative federated knowledge distillation-based semi-supervised learning
    Karami, Ali
    Ramezani, Reza
    Baraani Dastjerdi, Ahmad
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024, 15 (12) : 5509 - 5529
  • [6] Distillation-based fabric anomaly detection
    Thomine, Simon
    Snoussi, Hichem
    TEXTILE RESEARCH JOURNAL, 2024, 94 (5-6) : 552 - 565
  • [7] FedGKD: Federated Graph Knowledge Distillation for privacy-preserving rumor detection
    Zheng, Peng
    Dou, Yong
    Yan, Yeqing
    KNOWLEDGE-BASED SYSTEMS, 2024, 304
  • [8] Facial landmark points detection using knowledge distillation-based neural networks
    Fard, Ali Pourramezan
    Mahoor, Mohammad H.
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2022, 215
  • [9] Group-Based Federated Knowledge Distillation Intrusion Detection
    Gao, Tiaokang
    Jin, Xiaoning
    INTERNATIONAL JOURNAL OF SOFTWARE ENGINEERING AND KNOWLEDGE ENGINEERING, 2024, 34 (08) : 1251 - 1279
  • [10] Knowledge Distillation-Based Multilingual Fusion Code Retrieval
    Li, Wen
    Xu, Junfei
    Chen, Qi
    ALGORITHMS, 2022, 15 (01)