Delta-Bench: Differential Benchmark for Static Analysis Security Testing Tools

Cited by: 14
Authors
Pashchenko, Ivan [1 ]
Dashevskyi, Stanislav [1 ]
Massacci, Fabio [1 ]
Affiliations
[1] University of Trento, Trento, Italy
Keywords
Static Analysis; Static Application Security Testing Tool; Vulnerability; Software Security; Large-scale Benchmark;
DOI
10.1109/ESEM.2017.24
Chinese Library Classification (CLC)
TP31 [Computer Software]
Discipline Codes
081202; 0835
Abstract
Background: Static analysis security testing (SAST) tools may be evaluated using synthetic micro-benchmarks and benchmarks based on real-world software. Aims: The aim of this study is to address the limitations of existing SAST tool benchmarks: lack of vulnerability realism, uncertain ground truth, and a large number of findings unrelated to the analyzed vulnerabilities. Method: We propose Delta-Bench, a novel approach for the automatic construction of benchmarks for SAST tools based on differencing vulnerable and fixed versions in Free and Open Source Software (FOSS) repositories. To test our approach, we ran 7 state-of-the-art SAST tools against 70 revisions of four major versions of Apache Tomcat, spanning 62 distinct Common Vulnerabilities and Exposures (CVE) fixes and vulnerable files totalling over 100K lines of code, as the source of ground-truth vulnerabilities. Results: Our experiment allows us to draw interesting conclusions (e.g., tools perform differently depending on the selected benchmark). Conclusions: Delta-Bench allows SAST tools to be automatically evaluated on real-world historical vulnerabilities, using only the findings that a tool produced for the analyzed vulnerability.
Pages: 163-168
Page count: 6
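
The Method described in the abstract rests on differencing a vulnerable revision against its fixed revision: the lines removed or changed by the CVE fix serve as ground truth, and only tool findings that fall into fix-touched files are scored, discarding findings unrelated to the analyzed vulnerability. The following Python sketch illustrates this differencing-and-filtering idea under stated assumptions; it is not the authors' implementation, and the Finding record, function names, and example paths are hypothetical.

import difflib
from dataclasses import dataclass

@dataclass(frozen=True)
class Finding:
    """One SAST warning (hypothetical structure)."""
    path: str   # file flagged by the tool
    line: int   # 1-based line number of the warning

def vulnerable_lines(vuln_src, fixed_src):
    """Return 1-based line numbers of the vulnerable revision that the
    fix deleted or replaced: the 'delta' used as ground truth."""
    matcher = difflib.SequenceMatcher(
        None, vuln_src.splitlines(), fixed_src.splitlines())
    changed = set()
    for tag, i1, i2, _j1, _j2 in matcher.get_opcodes():
        if tag in ("delete", "replace"):
            changed.update(range(i1 + 1, i2 + 1))
    return changed

def score(findings, ground_truth):
    """Score only findings in files touched by the fix; all other
    findings are ignored as unrelated to the analyzed vulnerability."""
    relevant = [f for f in findings if f.path in ground_truth]
    tp = sum(1 for f in relevant if f.line in ground_truth[f.path])
    return tp, len(relevant) - tp

# Hypothetical usage with one file of a vulnerable vs. fixed Tomcat revision:
# gt = {"java/org/apache/catalina/SomeValve.java":
#       vulnerable_lines(open("vuln.java").read(), open("fixed.java").read())}
# tp, fp = score(tool_findings, gt)

In this sketch, a true positive is a finding that hits a fix-changed line of a fix-touched file; restricting scoring to those files is what keeps unrelated warnings out of the evaluation, mirroring the filtering goal stated in the abstract.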