LineVul: A Transformer-based Line-Level Vulnerability Prediction

Cited by: 135
Authors
Fu, Michael [1 ]
Tantithamthavorn, Chakkrit [1 ]
Affiliations
[1] Monash Univ, Clayton, Vic, Australia
Funding
Australian Research Council
DOI
10.1145/3524842.3528452
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Software vulnerabilities are prevalent in software systems, causing problems ranging from deadlock and information loss to system failures. Early prediction of software vulnerabilities is therefore critically important in safety-critical software systems. Various ML/DL-based approaches have been proposed to predict vulnerabilities at the file, function, or method level. Recently, IVDetect, a graph-based neural network, was proposed to predict vulnerabilities at the function level. Yet, IVDetect remains inaccurate and coarse-grained. In this paper, we propose LineVul, a Transformer-based line-level vulnerability prediction approach, to address several limitations of the state-of-the-art IVDetect approach. Through an empirical evaluation on a large-scale real-world dataset of 188k+ C/C++ functions, we show that LineVul achieves (1) 160%-379% higher F1-measure for function-level predictions; (2) 12%-25% higher Top-10 Accuracy for line-level predictions; and (3) 29%-53% less Effort@20%Recall than the baseline approaches, highlighting a significant advance towards more accurate and more cost-effective line-level vulnerability prediction. Additional analysis shows that LineVul is also highly accurate (75%-100%) at predicting vulnerable functions affected by the Top-25 most dangerous CWEs, highlighting its potential impact in real-world usage scenarios.
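LineVul's line-level localization is attention-based: once the Transformer classifies a function as vulnerable, the attention weights of each subword token are attributed back to the source line the token came from, and lines are ranked by their total score (Top-10 Accuracy above refers to this ranking). A minimal sketch of the aggregation step, assuming the per-token attention weights have already been extracted from a fine-tuned model; the function name and toy values are illustrative, not taken from the paper's code:

```python
import numpy as np

def rank_lines_by_attention(num_lines, token_attention, token_line_ids, top_k=10):
    """Sum per-token attention weights into per-line scores, then rank lines.

    Sketch of LineVul-style localization: each token's attention weight
    (e.g. summed over heads and layers) is added to the score of the source
    line it originated from; the highest-scoring lines are flagged first.
    """
    scores = np.zeros(num_lines)
    for attn, line_id in zip(token_attention, token_line_ids):
        scores[line_id] += attn
    order = np.argsort(scores)[::-1][:top_k]  # highest-scoring lines first
    return [(int(i), float(scores[i])) for i in order]

# Toy example: 3 source lines, 4 subword tokens mapped back to their lines.
ranking = rank_lines_by_attention(
    num_lines=3,
    token_attention=[0.1, 0.5, 0.2, 0.2],  # hypothetical aggregated weights
    token_line_ids=[0, 1, 1, 2],           # token -> originating source line
    top_k=2,
)
print(ranking)  # line 1 accumulates 0.5 + 0.2 and ranks first
```

The mapping from subword tokens back to line numbers is produced during tokenization; any tokens without a source line (e.g. special tokens) would simply be skipped before this step.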
Pages: 608-620
Page count: 13
Related papers (50 in total)
  • [41] Abduallah, Yasser; Wang, Jason T. L.; Wang, Haimin; Xu, Yan. Operational prediction of solar flares using a transformer-based framework. Scientific Reports, 2023, 13(1).
  • [42] Hu, Jie; Chen, Lin; Wang, Zhihong; Qing, Haihua; Wang, Haojie. Transformer-Based Prediction of Charging Time for Pure Electric Vehicles. Qiche Gongcheng/Automotive Engineering, 2024, 46(11): 2059-2067.
  • [43] Lee, Jaewook; Son, Jinkyung; Lim, Juri; Kim, In; Kim, Seonwoo; Cho, Namjung; Choi, Woojin; Shin, Dongil. Transformer-Based Mechanical Property Prediction for Polymer Matrix Composites. Korean Journal of Chemical Engineering, 2024, 41(11): 3005-3018.
  • [44] He, Haitao; Yan, Bingjian; Xu, Ke; Yu, Lu. Telecontext-Enhanced Recursive Interactive Attention Fusion Method for Line-Level Defect Prediction. CMC-Computers Materials & Continua, 2025, 82(2): 2077-2108.
  • [45] Cai, Tao; Yu, Baocheng; Xu, Wenxia. Transformer-Based BiLSTM for Aspect-Level Sentiment Classification. 2021 4th International Conference on Robotics, Control and Automation Engineering (RCAE 2021), 2021: 138-142.
  • [47] Ahmad, Muhammad Waqas; Akram, Muhammad Usman; Mohsan, Mashood Mohammad; Saghar, Kashif; Ahmad, Rashid; Butt, Wasi Haider. Transformer-based sensor failure prediction and classification framework for UAVs. Expert Systems with Applications, 2024, 248.
  • [48] Zhang, Zhengming; Tian, Renran; Ding, Zhengming. TrEP: Transformer-Based Evidential Prediction for Pedestrian Intention with Uncertainty. Thirty-Seventh AAAI Conference on Artificial Intelligence (AAAI-23), Vol 37, No 3, 2023: 3534-3542.
  • [49] Wang, Yuchen; Qing, Linbo; Wang, Zhengyong; Cheng, Yongqiang; Peng, Yonghong. Multi-Level Transformer-Based Social Relation Recognition. Sensors, 2022, 22(15).
  • [50] Banar, Nikolay; Daelemans, Walter; Kestemont, Mike. Character-Level Transformer-Based Neural Machine Translation. 2020 4th International Conference on Natural Language Processing and Information Retrieval (NLPIR 2020), 2020: 149-156.