FedKG: A Knowledge Distillation-Based Federated Graph Method for Social Bot Detection

Cited by: 1
Authors
Wang, Xiujuan [1 ]
Chen, Kangmiao [1 ]
Wang, Keke [1 ]
Wang, Zhengxiang [1 ]
Zheng, Kangfeng [2 ]
Zhang, Jiayue [1 ]
Affiliations
[1] Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
[2] Beijing Univ Posts & Telecommun, Sch Cyberspace Secur, Beijing 100876, Peoples R China
Funding
Beijing Natural Science Foundation;
Keywords
social bot detection; federated learning; knowledge distillation; graph neural network;
DOI
10.3390/s24113481
CLC Number
O65 [Analytical Chemistry];
Discipline Codes
070302; 081704;
Abstract
Malicious social bots pose a serious threat to social network security by spreading false information and manipulating public opinion. The limited scope and scarcity of any single organization's data, together with the high cost of labeling social bots, motivate federated models that combine federated learning with social bot detection. In this paper, we first combine the federated learning framework with the Relational Graph Convolutional Network (RGCN) model to achieve federated social bot detection. A class-level cross-entropy loss function is applied during local model training to mitigate the effects of class imbalance in local data. To address data heterogeneity across participants, we optimize the classical federated learning algorithm with knowledge distillation methods. Specifically, we adjust the client-side and server-side models separately: a global generator is trained to produce pseudo-samples based on the local data distribution knowledge, correcting the optimization direction of the client-side classification models, and the knowledge of the client-side classification models is integrated on the server side to guide the training of the global classification model. Extensive experiments on widely used datasets demonstrate the effectiveness of our approach to social bot detection in heterogeneous data scenarios. Compared with baseline methods, our approach improves detection accuracy by roughly 3-10% when data heterogeneity is high, and it reaches the specified accuracy in fewer communication rounds.
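The two loss components described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the inverse-frequency class weighting, the temperature `T`, and the mixing coefficient `alpha` are assumptions chosen to show the general class-weighted cross entropy + distillation pattern.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stable).
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def class_weighted_ce(logits, labels, class_counts):
    # Class-level cross entropy: each class is weighted inversely to its
    # frequency, so the minority (bot) class is not swamped by humans.
    probs = softmax(logits)
    counts = np.asarray(class_counts, dtype=float)
    weights = counts.sum() / (len(counts) * counts)
    per_sample = -np.log(probs[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(weights[labels] * per_sample))

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 in the usual Hinton-style distillation convention.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    return float(T * T * np.mean(kl))

# Toy example: 2 classes (human = 0, bot = 1) with a 9:1 imbalance.
logits = np.array([[2.0, 0.5], [0.2, 1.5], [1.0, 1.0]])   # student outputs
labels = np.array([0, 1, 1])
teacher = np.array([[1.8, 0.4], [0.1, 1.7], [0.9, 1.2]])  # e.g. aggregated server model

ce = class_weighted_ce(logits, labels, class_counts=[900, 100])
kd = distillation_loss(logits, teacher)
total = ce + 0.5 * kd  # alpha = 0.5 balances task loss against distillation
```

In the federated setting the teacher logits would come from the server-side aggregate (or, on the server, from the ensemble of client models), and the same weighted task loss would be computed over each client's local graph embeddings.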
Pages: 17
Related Papers
50 records
  • [21] MobileNet and Knowledge Distillation-Based Automatic Scenario Recognition Method in Vehicle-to-Vehicle Systems
    Yang, Jie
    Wang, Yu
    Zhao, Haitao
    Gui, Guan
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2022, 71 (10) : 11006 - 11016
  • [22] Domain-Aware Federated Social Bot Detection with Multi-Relational Graph Neural Networks
    Peng, Huailiang
    Zhang, Yujun
    Sun, Hao
    Bai, Xu
    Li, Yangyang
    Wang, Shuhai
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [23] A Distillation-based Attack Against Adversarial Training Defense for Smart Grid Federated Learning
    Bondok, Atef H.
    Mahmoud, Mohamed
    Badr, Mahmoud M.
    Fouda, Mostafa M.
    Alsabaan, Maazen
    2024 IEEE 21ST CONSUMER COMMUNICATIONS & NETWORKING CONFERENCE, CCNC, 2024, : 963 - 968
  • [24] FedUA: An Uncertainty-Aware Distillation-Based Federated Learning Scheme for Image Classification
    Lee, Shao-Ming
    Wu, Ja-Ling
    INFORMATION, 2023, 14 (04)
  • [25] BotCL: a social bot detection model based on graph contrastive learning
    Li, Yan
    Li, Zhenyu
    Gong, Daofu
    Hu, Qian
    Lu, Haoyu
    KNOWLEDGE AND INFORMATION SYSTEMS, 2024, 66 (09) : 5185 - 5202
  • [26] Intention embedding method based social bot detection
    Niu, Hongfeng
    Li, Jiawei
    Song, Yunpeng
    Cai, Zhongmin
    Tongxin Xuebao/Journal on Communications, 45 (11): 194 - 205
  • [27] Federated Learning Algorithm Based on Knowledge Distillation
    Jiang, Donglin
    Shan, Chen
    Zhang, Zhihui
    2020 INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND COMPUTER ENGINEERING (ICAICE 2020), 2020, : 163 - 167
  • [28] Knowledge Distillation-based Learning Model Propagation for Urban Air Mobility
    Xiong, Kai
    Xie, Juefei
    Wang, Zhihong
    Leng, Supeng
    2024 IEEE 99TH VEHICULAR TECHNOLOGY CONFERENCE, VTC2024-SPRING, 2024,
  • [29] A knowledge distillation-based deep interaction compressed network for CTR prediction
    Guan, Fei
    Qian, Cheng
    He, Feiyan
    KNOWLEDGE-BASED SYSTEMS, 2023, 275
  • [30] Minifying photometric stereo via knowledge distillation-based feature translation
    Han, Seungoh
    Park, Jinsun
    Cho, Donghyeon
    OPTICS EXPRESS, 2022, 30 (21) : 38284 - 38297