A Lightweight Malware Detection Model Based on Knowledge Distillation

Cited: 0
Authors
Miao, Chunyu [1 ]
Kou, Liang [2 ]
Zhang, Jilin [2 ]
Dong, Guozhong [3 ]
Affiliations
[1] Zhejiang Normal Univ, Res Ctr Network Applicat Secur, Jinhua 321017, Peoples R China
[2] Hangzhou Dianzi Univ, Coll Cyberspace, Hangzhou 310005, Peoples R China
[3] Pengcheng Lab, Dept New Networks, Shenzhen 518066, Peoples R China
Keywords
malware detection; pre-training models; knowledge distillation; lightweight models;
DOI
10.3390/math12244009
CLC Number
O1 [Mathematics];
Discipline Code
0701; 070101;
Abstract
The extremely destructive nature of malware has made it a major threat to Internet security, and research on malware detection techniques continues to evolve. Deep learning-based malware detection methods have achieved good results by using large-scale pre-trained models. However, these models are complex, have a large number of parameters, and require substantial hardware resources and long inference times when deployed. To address this challenge, this paper proposes DistillMal, a new method for lightweight malware detection based on knowledge distillation, in which a student network learns valuable hint knowledge from a teacher network to yield a lightweight model. Extensive experiments on two new datasets show that the student network's performance is very close to that of the original model and even outperforms it on some metrics. Our approach helps address the resource constraints and computational challenges faced by large deep learning models. Our research highlights the potential of knowledge distillation for developing lightweight malware detection models.
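To illustrate the distillation setup the abstract describes, the following is a minimal PyTorch sketch in which a small student network learns to match the temperature-softened output distribution of a large, frozen teacher. The architectures, feature dimension (256), temperature, and loss weighting below are illustrative assumptions for a generic binary malware classifier, not the paper's actual DistillMal configuration.

# Minimal knowledge-distillation sketch (illustrative; not DistillMal's real setup).
import torch
import torch.nn as nn
import torch.nn.functional as F

class Teacher(nn.Module):
    """Stand-in for a large pre-trained malware detector."""
    def __init__(self, in_dim=256, num_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 1024), nn.ReLU(),
            nn.Linear(1024, 1024), nn.ReLU(),
            nn.Linear(1024, num_classes),
        )
    def forward(self, x):
        return self.net(x)

class Student(nn.Module):
    """Much smaller network intended for resource-constrained deployment."""
    def __init__(self, in_dim=256, num_classes=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, num_classes),
        )
    def forward(self, x):
        return self.net(x)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    # Soft targets: KL divergence between temperature-softened distributions.
    # The T*T factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

if __name__ == "__main__":
    teacher, student = Teacher(), Student()
    teacher.eval()  # the teacher is frozen during distillation
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)

    # Dummy batch standing in for extracted malware feature vectors.
    x = torch.randn(32, 256)
    y = torch.randint(0, 2, (32,))

    opt.zero_grad()
    with torch.no_grad():
        t_logits = teacher(x)
    s_logits = student(x)
    loss = distillation_loss(s_logits, t_logits, y)
    loss.backward()
    opt.step()
    print(f"distillation loss: {loss.item():.4f}")

The key design choice is that the student is supervised by both the true labels and the teacher's full output distribution, which carries more information per example than hard labels alone; this is what lets a far smaller model approach the teacher's accuracy.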
Pages: 13
Related Papers
50 records in total
  • [21] Lightweight Spectrum Prediction Based on Knowledge Distillation
    Cheng, Runmeng
    Zhang, Jianzhao
    Deng, Junquan
    Zhu, Yanping
    RADIOENGINEERING, 2023, 32 (04) : 469 - 478
  • [22] Lightweight One-Stage Maize Leaf Disease Detection Model with Knowledge Distillation
    Hu, Yanxin
    Liu, Gang
    Chen, Zhiyu
    Liu, Jiaqi
    Guo, Jianwei
AGRICULTURE-BASEL, 2023, 13 (09)
  • [23] A Malware Classification Method Based on Knowledge Distillation and Feature Fusion
    Guan, Xin
    Zhang, Guodong
    IEEE ACCESS, 2025, 13 : 51268 - 51276
  • [24] KD-LightNet: A Lightweight Network Based on Knowledge Distillation for Industrial Defect Detection
    Liu, Jinhai
    Li, Hengguang
    Zuo, Fengyuan
    Zhao, Zhen
    Lu, Senxiang
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2023, 72
  • [25] Domain adaptation and knowledge distillation for lightweight pavement crack detection
    Xiao, Tianhao
    Pang, Rong
    Liu, Huijun
    Yang, Chunhua
    Li, Ao
    Niu, Chenxu
    Ruan, Zhimin
    Xu, Ling
    Ge, Yongxin
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 263
  • [26] A lightweight crack segmentation network based on knowledge distillation
    Wang, Wenjun
    Su, Chao
    Han, Guohui
    Zhang, Heng
    JOURNAL OF BUILDING ENGINEERING, 2023, 76
  • [27] A Face Forgery Video Detection Model Based on Knowledge Distillation
    Liang, Haobo
    Leng, Yingxiong
    Luo, Jinman
    Chen, Jie
    Guo, Xiaoji
    27TH IEEE/ACIS INTERNATIONAL SUMMER CONFERENCE ON SOFTWARE ENGINEERING ARTIFICIAL INTELLIGENCE NETWORKING AND PARALLEL/DISTRIBUTED COMPUTING, SNPD 2024-SUMMER, 2024, : 50 - 55
  • [28] TinyDroid: A Lightweight and Efficient Model for Android Malware Detection and Classification
    Chen, Tieming
    Mao, Qingyu
    Yang, Yimin
    Lv, Mingqi
    Zhu, Jianming
    MOBILE INFORMATION SYSTEMS, 2018, 2018
  • [29] Feature knowledge distillation-based model lightweight for prohibited item detection in X-ray security inspection images
    Ren, Yu
    Zhao, Lun
    Zhang, Yongtao
    Liu, Yiyao
    Yang, Jinfeng
    Zhang, Haigang
    Lei, Baiying
    ADVANCED ENGINEERING INFORMATICS, 2025, 65
  • [30] Reconstructed Graph Neural Network With Knowledge Distillation for Lightweight Anomaly Detection
    Zhou, Xiaokang
    Wu, Jiayi
    Liang, Wei
    Wang, Kevin I-Kai
    Yan, Zheng
    Yang, Laurence T.
    Jin, Qun
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (09) : 11817 - 11828