Always be Pre-Training: Representation Learning for Network Intrusion Detection with GNNs

Cited by: 1
Authors
Gu, Zhengyao [1 ]
Lopez, Diego Troy [2 ]
Alrahis, Lilas [3 ]
Sinanoglu, Ozgur [3 ]
Affiliations
[1] NYU, Ctr Data Sci, New York, NY 10012 USA
[2] NYU, Res Technol Serv, New York, NY USA
[3] New York Univ Abu Dhabi, Abu Dhabi, U Arab Emirates
Keywords
Intrusion detection; machine learning; graph neural network; NIDS; few-shot learning; self-supervised learning; INTERNET OF THINGS; ATTACK; IOT
DOI
10.1109/ISQED60706.2024.10528371
Chinese Library Classification (CLC)
TP3 [computing technology, computer technology]
Subject Classification
0812
Abstract
Graph neural network-based network intrusion detection systems have recently demonstrated state-of-the-art performance on benchmark datasets. Nevertheless, these methods rely on target encoding for data pre-processing, which requires annotated labels, a cost-prohibitive requirement that limits widespread adoption. In this work, we propose a solution involving in-context pre-training and the use of dense representations for categorical features to jointly overcome this label dependency. Our approach is remarkably data-efficient, achieving over 98% of the performance of the supervised state of the art with less than 4% of the labeled data on the NF-UQ-NIDS-V2 dataset.
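The key distinction the abstract draws, learned dense embeddings for categorical features versus label-dependent target encoding, can be made concrete. Below is a minimal sketch, not the authors' implementation: it assumes PyTorch, and every module name, feature cardinality, and the toy graph are hypothetical. The point it illustrates is that target encoding needs class labels to compute per-category target statistics, whereas an embedding table is trained end-to-end and can therefore be pre-trained without any labels.

```python
# Hedged sketch (not the authors' code): learned dense embeddings for
# categorical NetFlow fields in place of label-dependent target encoding.
# Field names, cardinalities, dimensions, and the toy graph are hypothetical.
import torch
import torch.nn as nn

class FlowEncoder(nn.Module):
    """Embeds categorical flow fields and concatenates numeric fields."""
    def __init__(self, cardinalities, num_numeric, dim=32):
        super().__init__()
        # One trainable table per categorical field (e.g. protocol, TCP flags).
        # Unlike target encoding, building these requires no class labels.
        self.tables = nn.ModuleList(nn.Embedding(c, dim) for c in cardinalities)
        self.proj = nn.Linear(dim * len(cardinalities) + num_numeric, dim)

    def forward(self, cat_ids, numeric):
        # cat_ids: (N, n_cat) integer codes; numeric: (N, n_num) floats
        dense = [tab(cat_ids[:, i]) for i, tab in enumerate(self.tables)]
        return self.proj(torch.cat(dense + [numeric], dim=-1))

class MeanAggLayer(nn.Module):
    """One round of mean-neighbor message passing over a dense adjacency."""
    def __init__(self, dim=32):
        super().__init__()
        self.update = nn.Linear(2 * dim, dim)

    def forward(self, x, adj):
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        neighbors = adj @ x / deg              # average neighbor state
        return torch.relu(self.update(torch.cat([x, neighbors], dim=-1)))

# Toy usage: 4 flows, two categorical fields, three numeric fields.
encoder = FlowEncoder(cardinalities=[256, 16], num_numeric=3)
gnn = MeanAggLayer()
cat_ids = torch.randint(0, 16, (4, 2))        # codes valid for both tables
numeric = torch.randn(4, 3)
adj = (torch.rand(4, 4) > 0.5).float()        # random toy adjacency
states = gnn(encoder(cat_ids, numeric), adj)  # (4, 32) node representations
```

In the paper's setting, the resulting node states would presumably feed a self-supervised objective during pre-training and a lightweight classifier during few-shot fine-tuning; the mean-aggregation layer above is only a stand-in for whatever GNN architecture is actually used.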
Pages: 8
Related Papers
50 items total
  • [41] Layer-wise Pre-training Mechanism Based on Neural Network for Epilepsy Detection
    Lin, Zichao
    Gu, Zhenghui
    Li, Yinghao
    Yu, Zhuliang
    Li, Yuanqing
    2020 12TH INTERNATIONAL CONFERENCE ON ADVANCED COMPUTATIONAL INTELLIGENCE (ICACI), 2020, : 224 - 227
  • [42] Ensemble and Pre-Training Approach for Echo State Network and Extreme Learning Machine Models
    Tang, Lingyu
    Wang, Jun
    Wang, Mengyao
    Zhao, Chunyu
    ENTROPY, 2024, 26 (03)
  • [43] Label-efficient object detection via region proposal network pre-training
    Dong, Nanqing
    Ericsson, Linus
    Yang, Yongxin
    Leonardis, Ales
    McDonagh, Steven
    NEUROCOMPUTING, 2024, 577
  • [44] New Intent Discovery with Pre-training and Contrastive Learning
    Zhang, Yuwei
    Zhang, Haode
    Zhan, Li-Ming
    Wu, Xiao-Ming
    Lam, Albert Y. S.
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 256 - 269
  • [45] An Empirical Investigation of the Role of Pre-training in Lifelong Learning
    Mehta, Sanket Vaibhav
    Patil, Darshan
    Chandar, Sarath
    Strubell, Emma
    JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [46] Image Difference Captioning with Pre-training and Contrastive Learning
    Yao, Linli
    Wang, Weiying
    Jin, Qin
    THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / THE TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 3108 - 3116
  • [47] Pre-training with Meta Learning for Chinese Word Segmentation
    Ke, Zhen
    Shi, Liang
    Sun, Songtao
    Meng, Erli
    Wang, Bin
    Qiu, Xipeng
    2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 5514 - 5523
  • [48] Malbert: A novel pre-training method for malware detection
    Xu, Zhifeng
    Fang, Xianjin
    Yang, Gaoming
    COMPUTERS & SECURITY, 2021, 111
  • [49] User Behavior Pre-training for Online Fraud Detection
    Liu, Can
    Gao, Yuncong
    Sun, Li
    Feng, Jinghua
    Yang, Hao
    Ao, Xiang
    PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 3357 - 3365
  • [50] Learning Visual Prior via Generative Pre-Training
    Xie, Jinheng
    Ye, Kai
    Li, Yudong
    Li, Yuexiang
    Lin, Kevin Qinghong
    Zheng, Yefeng
    Shen, Linlin
    Shou, Mike Zheng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,