A Lightweight Malware Detection Model Based on Knowledge Distillation

Cited: 0
Authors
Miao, Chunyu [1]
Kou, Liang [2]
Zhang, Jilin [2]
Dong, Guozhong [3]
Affiliations
[1] Zhejiang Normal Univ, Res Ctr Network Applicat Secur, Jinhua 321017, Peoples R China
[2] Hangzhou Dianzi Univ, Coll Cyberspace, Hangzhou 310005, Peoples R China
[3] Pengcheng Lab, Dept New Networks, Shenzhen 518066, Peoples R China
Keywords
malware detection; pre-training models; knowledge distillation; lightweight models;
DOI
10.3390/math12244009
Chinese Library Classification (CLC)
O1 [Mathematics]
Discipline Classification Code
0701; 070101
Abstract
The extremely destructive nature of malware has made it a major threat to Internet security, and research on malware detection techniques continues to evolve. Deep learning-based malware detection methods have achieved good results by using large-scale, pre-trained models. However, these models are complex, have a large number of parameters, and demand substantial hardware resources and high inference time when deployed. To address this challenge, this paper proposes DistillMal, a new method for lightweight malware detection based on knowledge distillation, in which a student network learns valuable hint knowledge from a teacher network, yielding a lightweight model with improved performance. We conducted extensive experiments on two new datasets and showed that the student network's performance is very close to that of the original model and even outperforms it on some metrics. Our approach helps address the resource constraints and computational challenges faced by traditional large deep learning models. Our research highlights the potential of using knowledge distillation to develop lightweight malware detection models.
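As a companion to the abstract, the following is a minimal sketch of the standard teacher-student distillation loss that approaches of this kind build on. It is not the authors' DistillMal implementation; the use of PyTorch, the temperature T, and the weighting alpha are illustrative assumptions.

# Minimal sketch of a Hinton-style knowledge-distillation loss, illustrating the
# teacher/student setup described in the abstract. NOT the authors' DistillMal code;
# the temperature T and weighting alpha are illustrative assumptions.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Combine hard-label cross-entropy with temperature-softened KL to the teacher."""
    # Hard-label loss against the ground-truth malware/benign labels.
    hard = F.cross_entropy(student_logits, labels)
    # Soft-label loss: match the teacher's temperature-scaled output distribution.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitude is comparable to the hard loss
    return alpha * hard + (1.0 - alpha) * soft

# Usage sketch: the teacher is frozen, only the student is updated.
# for features, labels in loader:
#     with torch.no_grad():
#         t_logits = teacher(features)
#     s_logits = student(features)
#     loss = distillation_loss(s_logits, t_logits, labels)
#     loss.backward(); optimizer.step(); optimizer.zero_grad()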
Pages: 13
Related Papers (50 in total)
  • [1] A Lightweight Android Malware Detection Framework Based on Knowledge Distillation
    Zhi, Yongbo
    Xi, Ning
    Liu, Yuanqing
    Hui, Honglei
    NETWORK AND SYSTEM SECURITY, NSS 2021, 2021, 13041 : 116 - 130
  • [2] Lightweight intrusion detection model based on CNN and knowledge distillation
    Wang, Long-Hui
    Dai, Qi
    Du, Tony
    Chen, Li-fang
    APPLIED SOFT COMPUTING, 2024, 165
  • [3] A Lightweight Pipeline Edge Detection Model Based on Heterogeneous Knowledge Distillation
    Zhu, Chengyuan
    Pu, Yanyun
    Lyu, Zhuoling
    Wu, Aonan
    Yang, Kaixiang
    Yang, Qinmin
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2024, 71 (12) : 5059 - 5063
  • [4] Lightweight detection network for bridge defects based on model pruning and knowledge distillation
    Guan, Bin
    Li, Junjie
    STRUCTURES, 2024, 62
  • [5] An Efficient and Lightweight Approach for Intrusion Detection based on Knowledge Distillation
    Zhao, Ruijie
    Chen, Yu
    Wang, Yijun
    Shi, Yong
    Xue, Zhi
IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2021), 2021
  • [6] Yarn state detection based on lightweight network and knowledge distillation
    Ren G.
    Tu J.
    Li Y.
    Qiu Z.
    Shi W.
    Fangzhi Xuebao/Journal of Textile Research, 2023, 44 (09): : 205 - 212
  • [7] Lightweight Tunnel Defect Detection Algorithm Based on Knowledge Distillation
    Zhu, Anfu
    Wang, Bin
    Xie, Jiaxiao
    Ma, Congxiao
    ELECTRONICS, 2023, 12 (15)
  • [8] Knowledge Distillation Facilitates the Lightweight and Efficient Plant Diseases Detection Model
    Huang, Qianding
    Wu, Xingcai
    Wang, Qi
    Dong, Xinyu
    Qin, Yongbin
    Wu, Xue
    Gao, Yangyang
    Hao, Gefei
    PLANT PHENOMICS, 2023, 5
  • [9] Lightweight Network Traffic Classification Model Based on Knowledge Distillation
    Wu, Yanhui
    Zhang, Meng
    WEB INFORMATION SYSTEMS ENGINEERING - WISE 2021, PT II, 2021, 13081 : 107 - 121