A Lightweight Malware Detection Model Based on Knowledge Distillation

Cited by: 0
Authors
Miao, Chunyu [1 ]
Kou, Liang [2 ]
Zhang, Jilin [2 ]
Dong, Guozhong [3 ]
Affiliations
[1] Zhejiang Normal Univ, Res Ctr Network Applicat Secur, Jinhua 321017, Peoples R China
[2] Hangzhou Dianzi Univ, Coll Cyberspace, Hangzhou 310005, Peoples R China
[3] Pengcheng Lab, Dept New Networks, Shenzhen 518066, Peoples R China
Keywords
malware detection; pre-training models; knowledge distillation; lightweight models;
DOI
10.3390/math12244009
Chinese Library Classification
O1 [Mathematics];
Subject Classification Code
0701; 070101;
Abstract
The extremely destructive nature of malware has made it a major threat to Internet security, and research on malware detection techniques continues to evolve. Deep learning-based malware detection methods have achieved strong results by using large-scale, pre-trained models. However, these models are complex, have very large parameter counts, demand substantial hardware resources, and incur a high inference-time cost when deployed. To address this challenge, this paper proposes DistillMal, a new method for lightweight malware detection based on knowledge distillation, in which a lightweight student network learns valuable hint knowledge from a teacher network. We conducted extensive experiments on two new datasets and showed that the student network's performance is very close to that of the original model and even outperforms it on some metrics. Our approach helps address the resource constraints and computational challenges faced by traditional large deep learning models. Our research highlights the potential of knowledge distillation for developing lightweight malware detection models.
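The abstract describes the standard knowledge-distillation setup, where a compact student network is trained against a teacher network's softened outputs as well as the ground-truth labels. The sketch below is a minimal, generic illustration of that objective only; the loss form, temperature T, and mixing weight alpha are common defaults assumed for illustration, not values or architectures reported for DistillMal.

```python
# Minimal sketch of a generic knowledge-distillation objective:
# the student matches the teacher's softened logits plus the hard labels.
# T and alpha are illustrative assumptions, not DistillMal's settings.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Soft targets: KL divergence between softened teacher and student distributions,
    # scaled by T^2 to keep gradient magnitudes comparable to the hard-label term.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy on the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Usage (teacher frozen, only the lightweight student is updated):
# teacher.eval()
# with torch.no_grad():
#     t_logits = teacher(batch)
# s_logits = student(batch)
# loss = distillation_loss(s_logits, t_logits, labels)
```

In this kind of setup only the small student model is shipped for inference, which is what yields the reduced parameter count and lower inference cost emphasized in the abstract.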
Pages: 13
Related Papers
50 records in total
  • [31] A Lightweight Network-based Android Malware Detection System
    Sanz, Igor Jochem
    Lopez, Martin Andreoni
    Viegas, Eduardo Kugler
    Sanches, Vinicius Rodrigues
    2020 IFIP NETWORKING CONFERENCE AND WORKSHOPS (NETWORKING), 2020, : 695 - 703
  • [32] Lightweight image dehazing networks based on soft knowledge distillation
    Tran, Le-Anh
    Park, Dong-Chul
    VISUAL COMPUTER, 2024, : 4047 - 4066
  • [33] Lightweight remote sensing scene classification based on knowledge distillation
    Zhang, Chong-Yang
    Wang, Bin
    JOURNAL OF INFRARED AND MILLIMETER WAVES, 2024, 43 (05) : 684 - 695
  • [34] Incremental event detection via an improved knowledge distillation based model
    Lin, Yi
    Xu, Changhua
    Yu, Hang
    Tian, Pinzhuo
    Luo, Xiangfeng
    NEUROCOMPUTING, 2023, 551
  • [35] Knowledge Distillation based Compact Model Learning Method for Object Detection
    Ko, Jong Gook
    Yoo, Wonyoung
    11TH INTERNATIONAL CONFERENCE ON ICT CONVERGENCE: DATA, NETWORK, AND AI IN THE AGE OF UNTACT (ICTC 2020), 2020, : 1276 - 1278
  • [36] A Defect Detection Model for Industrial Products Based on Attention and Knowledge Distillation
    Zhang, Ze-Kai
    Zhou, Ming-Le
    Shao, Rui
    Li, Min
    Li, Gang
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [37] A Lightweight Multi-Source Fast Android Malware Detection Model
    Peng, Tao
    Hu, Bochao
    Liu, Junping
    Huang, Junjie
    Zhang, Zili
    He, Ruhan
    Hu, Xinrong
    APPLIED SCIENCES-BASEL, 2022, 12 (11):
  • [38] Knowledge distillation for object detection with diffusion model
    Zhang, Yi
    Long, Junzong
    Li, Chunrui
    NEUROCOMPUTING, 2025, 636
  • [39] Lightweight Underwater Target Detection Algorithm Based on Dynamic Sampling Transformer and Knowledge-Distillation Optimization
    Chen, Liang
    Yang, Yuyi
    Wang, Zhenheng
    Zhang, Jian
    Zhou, Shaowu
    Wu, Lianghong
    JOURNAL OF MARINE SCIENCE AND ENGINEERING, 2023, 11 (02)
  • [40] A Multi-Level Adaptive Lightweight Net for Damaged Road Marking Detection Based on Knowledge Distillation
    Wang, Junwei
    Zeng, Xiangqiang
    Wang, Yong
    Ren, Xiang
    Wang, Dongliang
    Qu, Wenqiu
    Liao, Xiaohan
    Pan, Peifen
    REMOTE SENSING, 2024, 16 (14)