BOOTABLE: Bioinformatics benchmark tool suite for applications and hardware

Cited by: 4
|
Authors
Hanussek, Maximilian [1 ,2 ]
Bartusch, Felix [1 ]
Krueger, Jens [1 ]
Affiliations
[1] Univ Tubingen, High Performance & Cloud Comp Grp ZDV, Wächterstr 76, D-72074 Tubingen, Germany
[2] Univ Tubingen, Grp Appl Bioinformat, Sand 14, D-72076 Tubingen, Germany
Keywords
Benchmark; Bioinformatics; Cloud; Scaling;
DOI
10.1016/j.future.2019.09.057
Chinese Library Classification (CLC)
TP301 [theory, methods];
Subject classification code
081202 ;
Abstract
Interest in analyzing biological data on a large scale has grown in recent years, and bioinformatics applications play an important role in the analysis of huge amounts of data. Because of the large volumes of biological data and/or large problem spaces, considerable computing resources are required to answer the research questions raised. To estimate which underlying hardware is most suitable for the bioinformatics tools applied, a well-defined benchmark suite is required. Such a benchmark suite is useful when purchasing hardware, and even more so for larger projects aiming to establish a bioinformatics compute infrastructure. In this paper we present BOOTABLE, our bioinformatics benchmark suite. BOOTABLE currently contains seven popular and widely used bioinformatics applications representing a broad spectrum of usage characteristics. It further includes an automated installation procedure and all required datasets, as well as functionality to test any desired application with regard to resource consumption and scaling behavior. (C) 2019 Elsevier B.V. All rights reserved.
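The scaling analysis the abstract describes reduces to comparing wall-clock runtimes across thread counts. As a minimal sketch (not BOOTABLE's actual API; the function name and the example runtimes are hypothetical), the speedup and parallel-efficiency metrics such a suite reports can be computed like this:

```python
# Minimal sketch: derive scaling metrics from measured wall-clock runtimes.
# Speedup = baseline runtime / runtime at N threads;
# efficiency = speedup / (N / baseline thread count).

def scaling_report(runtimes):
    """runtimes: dict mapping thread count -> wall-clock seconds."""
    threads = sorted(runtimes)
    base = runtimes[threads[0]]  # smallest thread count serves as baseline
    report = {}
    for t in threads:
        speedup = base / runtimes[t]
        efficiency = speedup / (t / threads[0])
        report[t] = (round(speedup, 2), round(efficiency, 2))
    return report

# Hypothetical runtimes (seconds) of one benchmark on 1, 2, 4, 8 threads
print(scaling_report({1: 120.0, 2: 64.0, 4: 36.0, 8: 24.0}))
```

An efficiency well below 1.0 at higher thread counts, as in the hypothetical numbers above, is the kind of diminishing-returns signal that guides hardware sizing.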
Pages: 1016 - 1026
Page count: 11
Related papers
50 items in total
  • [21] Tool suite bridges hardware and software digital-design issues
    Desposito, J
    [J]. ELECTRONIC DESIGN, 1999, 47 (11) : 33 - +
  • [22] TailBench: A Benchmark Suite and Evaluation Methodology for Latency-Critical Applications
    Kasture, Harshad
    Sanchez, Daniel
    [J]. PROCEEDINGS OF THE 2016 IEEE INTERNATIONAL SYMPOSIUM ON WORKLOAD CHARACTERIZATION, 2016, : 3 - 12
  • [23] DSPBench: A Suite of Benchmark Applications for Distributed Data Stream Processing Systems
    Bordin, Maycon Viana
    Griebler, Dalvan
    Mencagli, Gabriele
    Geyer, Claudio F. R.
    Fernandes, Luiz Gustavo L.
    [J]. IEEE ACCESS, 2020, 8 : 222900 - 222917
  • [24] An Automatic Tool for Benchmark Testing of Cloud Applications
    Casola, Valentina
    De Benedictis, Alessandra
    Rak, Massimiliano
    Villano, Umberto
    [J]. CLOSER: PROCEEDINGS OF THE 7TH INTERNATIONAL CONFERENCE ON CLOUD COMPUTING AND SERVICES SCIENCE, 2017, : 701 - 708
  • [25] Pattern Based Synthetic Benchmark Generation for Hardware Security Applications
    Meka, Juneeth Kumar
    Marupureddy, AmarKant
    Vemuri, Ranga
    [J]. PROCEEDINGS OF THE 37TH INTERNATIONAL CONFERENCE ON VLSI DESIGN, VLSID 2024 AND 23RD INTERNATIONAL CONFERENCE ON EMBEDDED SYSTEMS, ES 2024, 2024, : 461 - 466
  • [26] ParchMint: A Microfluidics Benchmark Suite
    Crites, Brian
    Sanka, Radhakrishna
    Lippai, Joshua
    McDaniel, Jeffrey
    Brisk, Philip
    Densmore, Douglas
    [J]. 2018 IEEE INTERNATIONAL SYMPOSIUM ON WORKLOAD CHARACTERIZATION (IISWC), 2018, : 78 - 79
  • [27] A benchmark suite for mobile robots
    Baltes, J
    [J]. 2000 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2000), VOLS 1-3, PROCEEDINGS, 2000, : 1101 - 1106
  • [29] The Bench++ benchmark suite
    Orost, JM
    [J]. DR DOBBS JOURNAL, 1998, 23 (10): : 58 - +
  • [30] Laser Attack Benchmark Suite
    Amornpaisannon, Burin
    Diavastos, Andreas
    Peh, Li-Shiuan
    Carlson, Trevor E.
    [J]. 2020 IEEE/ACM INTERNATIONAL CONFERENCE ON COMPUTER AIDED-DESIGN (ICCAD), 2020,