A Lightweight Malware Detection Model Based on Knowledge Distillation

Cited by: 0
Authors
Miao, Chunyu [1 ]
Kou, Liang [2 ]
Zhang, Jilin [2 ]
Dong, Guozhong [3 ]
Affiliations
[1] Zhejiang Normal Univ, Res Ctr Network Applicat Secur, Jinhua 321017, Peoples R China
[2] Hangzhou Dianzi Univ, Coll Cyberspace, Hangzhou 310005, Peoples R China
[3] Pengcheng Lab, Dept New Networks, Shenzhen 518066, Peoples R China
Keywords
malware detection; pre-training models; knowledge distillation; lightweight models;
DOI
10.3390/math12244009
Chinese Library Classification (CLC)
O1 [Mathematics]
Discipline Classification Code
0701; 070101
Abstract
The extremely destructive nature of malware has made it a major threat to Internet security, and malware detection techniques continue to evolve. Deep learning-based detection methods have achieved good results by using large-scale pre-trained models. However, these models are complex, contain a large number of parameters, and demand substantial hardware resources and long inference times when deployed. To address this challenge, this paper proposes DistillMal, a new lightweight malware detection method based on knowledge distillation, in which a student network learns valuable cueing knowledge from a teacher network to obtain a lightweight model. We conducted extensive experiments on two new datasets and show that the student network's performance is very close to that of the original model and even outperforms it on some metrics. Our approach helps address the resource constraints and computational challenges faced by traditional large deep learning models, and our research highlights the potential of knowledge distillation for developing lightweight malware detection models.
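For context, the teacher-to-student transfer described in the abstract is commonly trained with a combined soft-target and hard-label objective. The sketch below is a generic PyTorch illustration of such a distillation loss under that assumption; the temperature T, weighting factor alpha, and the function name distillation_loss are illustrative choices and are not taken from the DistillMal paper.

import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Generic Hinton-style knowledge distillation objective (a sketch;
    # T, alpha, and this exact formulation are assumptions, not DistillMal's loss).
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),  # softened student distribution
        F.softmax(teacher_logits / T, dim=1),      # softened teacher distribution
        reduction="batchmean",
    ) * (T * T)                                    # rescale so gradients stay comparable across T
    hard_loss = F.cross_entropy(student_logits, labels)  # supervised loss on ground-truth labels
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

Scaling the softened KL term by T^2 keeps its gradient magnitude comparable to the cross-entropy term as the temperature changes, which is why that factor appears in most distillation objectives of this form.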
Pages: 13
Related Papers (50 in total; items 41–50 shown)
• [41] Zhu, Shizhou; Xu, Xiaolong; Zhao, Juan; Xiao, Fu. LKD-STNN: A Lightweight Malicious Traffic Detection Method for Internet of Things Based on Knowledge Distillation. IEEE Internet of Things Journal, 2024, 11(4): 6438-6453.
• [42] Liu, Cailing; Zhang, Hongyi; Chen, Deyi. Knowledge Distillation Based on Pruned Model. Blockchain and Trustworthy Systems (BlockSys 2019), 2020, 1156: 598-603.
• [43] Wang, Xintong; Wang, Zixuan; Wang, Enliang; Sun, Zhixin. Spatial-temporal knowledge distillation for lightweight network traffic anomaly detection. Computers & Security, 2024, 137.
• [44] Ma, Runze; Yin, Shangnan; Feng, Xia; Zhu, Huijuan; Sheng, Victor S. A lightweight deep learning-based android malware detection framework. Expert Systems with Applications, 2024, 255.
• [45] Ye, Genchao; Zhang, Jian; Li, Huanzhou; Tang, Zhangguo; Lv, Tianzi. Android Malware Detection Technology Based on Lightweight Convolutional Neural Networks. Security and Communication Networks, 2022, 2022.
• [46] Chen, L.; Sun, Q.; Xu, Z.; Liao, Y. TGNet: A Lightweight Infrared Thermal Image Gesture Recognition Network Based on Knowledge Distillation and Model Pruning. 2024 Cross Strait Radio Science and Wireless Technology Conference (CSRSWTC 2024), 2024: 96-98.
• [47] Shang, Yingjun; Feng, Tao; Huo, Yonghua; Duan, Yongcun; Long, Yuhan. Lightweight Edge-side Fault Diagnosis Based on Knowledge Distillation. 2022 IEEE 14th International Conference on Advanced Infocomm Technology (ICAIT 2022), 2022: 348-353.
• [48] Shen, Zhilong; Li, Guoquan; Xia, Ruiyang; Meng, Hongying; Huang, Zhengwen. A Lightweight Object Counting Network Based on Density Map Knowledge Distillation. IEEE Transactions on Circuits and Systems for Video Technology, 2025, 35(2): 1492-1505.
• [49] Huang, Sung-Jen; Liu, Chia-Chuan; Chen, Chia-Ping. Sound Event Detection System Based on VGGSKCCT Model Architecture with Knowledge Distillation. Applied Artificial Intelligence, 2023, 37(1).
• [50] Gong, Ran; Wang, Chenlin; Li, Jinxiao; Xu, Yi. Lightweight fault diagnosis method in embedded system based on knowledge distillation. Journal of Mechanical Science and Technology, 2023, 37(11): 5649-5660.