Towards a Transformer-Based Pre-trained Model for IoT Traffic Classification

Cited by: 0
Authors
Bazaluk, Bruna [1 ]
Hamdan, Mosab [2 ]
Ghaleb, Mustafa [2 ]
Gismalla, Mohammed S. M. [2 ,3 ]
da Silva, Flavio S. Correa [1 ]
Batista, Daniel Macedo [1 ]
Affiliations
[1] Univ Sao Paulo, Dept Comp Sci, Sao Paulo, Brazil
[2] KFUPM, Interdisciplinary Res Ctr Intelligent Secure Syst, Dhahran, Saudi Arabia
[3] KFUPM, Ctr Commun Syst & Sensing, Dhahran, Saudi Arabia
Funding
Sao Paulo Research Foundation (FAPESP), Brazil
Keywords
IoT; Traffic Classification; Transformers; Feature Selection; MQTT; Machine Learning; Deep Learning
DOI
10.1109/NOMS59830.2024.10575448
Chinese Library Classification (CLC)
TP3 [Computing Technology, Computer Technology]
Discipline Code
0812
Abstract
The classification of IoT traffic is important for improving the efficiency and security of IoT-based networks, and state-of-the-art classification methods are based on Deep Learning. However, most current models require large amounts of training data, so in real-life situations where labeled IoT traffic is scarce, they do not perform well. Such models underperform outside their initial training conditions and fail to capture the complex characteristics of network traffic, rendering them inefficient and unreliable in real-world applications. In this paper, we propose a novel IoT Traffic Classification Transformer (ITCT) approach built on the state-of-the-art transformer-based TabTransformer model. The model, which is pre-trained on a large labeled MQTT-based IoT traffic dataset and can be fine-tuned with a small set of labeled data, showed promising results in various traffic classification tasks. Our experiments demonstrate that the ITCT model significantly outperforms existing models, achieving an overall accuracy of 82%. To support reproducibility and collaborative development, all associated code is made publicly available.
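The abstract does not spell out the training pipeline, so the sketch below is only a hypothetical illustration of the described idea: a TabTransformer-style encoder over categorical flow features, combined with layer-normalized continuous features, trained for multi-class traffic classification. The class name TinyTabTransformer, the feature layout, the class count, and all hyperparameters are assumptions for illustration, not the authors' released implementation.

```python
# Minimal TabTransformer-style sketch (hypothetical; not the paper's code).
# Assumed feature layout: a few categorical flow features (e.g. protocol,
# TCP flags, MQTT message type) plus continuous ones (packet sizes, timing).
import torch
import torch.nn as nn

class TinyTabTransformer(nn.Module):
    def __init__(self, cat_cardinalities, num_continuous, n_classes,
                 dim=32, depth=2, heads=4):
        super().__init__()
        # One embedding table per categorical column.
        self.embeds = nn.ModuleList(
            nn.Embedding(card, dim) for card in cat_cardinalities)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, dim_feedforward=4 * dim,
            batch_first=True)
        # As in TabTransformer (Huang et al., 2020), self-attention runs
        # over the categorical column embeddings only.
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=depth)
        self.norm = nn.LayerNorm(num_continuous)
        # MLP head on [contextualized categorical embeddings ++ continuous].
        self.head = nn.Sequential(
            nn.Linear(dim * len(cat_cardinalities) + num_continuous, 64),
            nn.ReLU(),
            nn.Linear(64, n_classes))

    def forward(self, x_categ, x_cont):
        # x_categ: (batch, n_cat) integer codes; x_cont: (batch, n_cont) floats.
        tokens = torch.stack(
            [emb(x_categ[:, i]) for i, emb in enumerate(self.embeds)], dim=1)
        ctx = self.encoder(tokens).flatten(1)  # (batch, n_cat * dim)
        return self.head(torch.cat([ctx, self.norm(x_cont)], dim=1))

# Hypothetical usage: the same training step would run first on a large
# labeled MQTT dataset (pre-training), then on a small target set
# (fine-tuning), reusing the learned weights.
model = TinyTabTransformer(cat_cardinalities=(4, 16, 8),
                           num_continuous=10, n_classes=5)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
x_categ = torch.randint(0, 4, (32, 3))  # toy batch; codes valid for all columns
x_cont = torch.randn(32, 10)
y = torch.randint(0, 5, (32,))
loss = loss_fn(model(x_categ, x_cont), y)
loss.backward()
opt.step()
```

In the actual pipeline, fine-tuning would presumably reuse the pre-trained weights (possibly with a re-initialized classification head); the authors' publicly released code, mentioned in the abstract, is the authoritative reference.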
Pages: 7