Towards Continuous Benchmarking: An Automated Performance Evaluation Framework for High Performance Software

Cited by: 3
Authors
Anzt, Hartwig [1 ,2 ]
Chen, Yen-Chen [3 ]
Cojean, Terry [1 ]
Dongarra, Jack [2 ,4 ,5 ]
Flegar, Goran [6 ]
Nayak, Pratik [1 ]
Quintana-Orti, Enrique S. [7 ]
Tsai, Yuhsiang M. [3 ]
Wang, Weichung [3 ]
Affiliations
[1] Karlsruhe Inst Technol, Karlsruhe, Germany
[2] Univ Tennessee, Knoxville, TN 37996 USA
[3] Natl Taiwan Univ, Taipei, Taiwan
[4] Oak Ridge Natl Lab, Oak Ridge, TN USA
[5] Univ Manchester, Manchester, Lancs, England
[6] Univ Jaume I, Castellon de la Plana, Castello, Spain
[7] Univ Politecn Valencia, Valencia, Spain
Keywords
interactive performance visualization; automated performance benchmarking; continuous integration; healthy software lifecycle
DOI
10.1145/3324989.3325719
CLC Classification Number
TP39 [Computer Applications]
Subject Classification Code
081203; 0835
Abstract
We present a framework that automates the testing and performance evaluation of software libraries. Integrating this component into a software ecosystem enables sustainable, community-driven development via a web application for interactively evaluating the performance of individual software components. The performance evaluation tool is based exclusively on web technologies, which removes the burden of downloading performance data or installing additional software. We employ this framework for the GINKGO software ecosystem, but the framework can be used with essentially any software project, including for comparisons between different software libraries. The Continuous Integration (CI) framework of GINKGO is also extended to automatically run a benchmark suite on predetermined HPC systems, store the state of the machine and the environment along with the compiled binaries, and collect results in a publicly accessible performance data repository based on Git. The GINKGO performance explorer (GPE) can be used to retrieve the performance data from the repository and visualize it in a web browser. GPE also implements an interface that allows users to write scripts, archived in a Git repository, that extract particular data, compute particular metrics, and visualize them in many different formats (as specified by the script). The combination of these approaches creates a workflow that enables performance reproducibility and software sustainability of scientific software. In this paper, we present example scripts that extract and visualize performance data for GINKGO's SpMV kernels, allowing users to identify the optimal kernel for specific problem characteristics.
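The abstract describes user scripts that extract SpMV performance data from the Git repository and identify the optimal kernel for given problem characteristics. A minimal sketch of such an analysis in Python, assuming a hypothetical JSON result schema (the `matrix`, `nonzeros`, and `spmv` field names are illustrative placeholders, not GINKGO's actual benchmark output format):

```python
import json

# Illustrative benchmark records: one per test matrix, with per-format
# SpMV runtimes in seconds (schema is an assumption for this sketch).
results = json.loads("""
[
  {"matrix": "A", "nonzeros": 120000,
   "spmv": {"csr": 1.9e-4, "coo": 2.5e-4, "ell": 1.4e-4}},
  {"matrix": "B", "nonzeros": 8000000,
   "spmv": {"csr": 9.1e-3, "coo": 7.8e-3, "ell": 2.3e-2}}
]
""")

def best_kernel(record):
    """Return (format, runtime) of the fastest SpMV kernel for one matrix."""
    return min(record["spmv"].items(), key=lambda kv: kv[1])

for rec in results:
    fmt, runtime = best_kernel(rec)
    # SpMV performs 2 flops (multiply + add) per stored nonzero.
    gflops = 2 * rec["nonzeros"] / runtime / 1e9
    print(f"{rec['matrix']}: fastest = {fmt} ({gflops:.2f} GFLOP/s)")
```

A real GPE script would fetch the JSON from the performance data repository rather than embed it, but the metric computation (fastest format per matrix, throughput from the nonzero count) follows the same pattern.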
Pages: 11
Related Papers (50 records)
  • [1] Towards Automated Continuous Performance Benchmarking of DHW and Combi Systems
    Schmelzer, Christoph
    Georgii, Matthias
    Kusyy, Oleg
    Orozaliev, Janybek
    Vajen, Klaus
    [J]. PROCEEDINGS OF THE ISES EUROSUN 2018 CONFERENCE - 12TH INTERNATIONAL CONFERENCE ON SOLAR ENERGY FOR BUILDINGS AND INDUSTRY, 2018, : 418 - 425
  • [2] Continuous Performance Benchmarking Framework for ROOT
    Shadura, Oksana
    Vassilev, Vassil
    Bockelman, Brian Paul
    [J]. 23RD INTERNATIONAL CONFERENCE ON COMPUTING IN HIGH ENERGY AND NUCLEAR PHYSICS (CHEP 2018), 2019, 214
  • [3] Towards a framework for automated performance tuning
    Cong, G.
    Seelam, S.
    Chung, I.
    Wen, H.
    Klepacki, D.
    [J]. 2009 IEEE INTERNATIONAL SYMPOSIUM ON PARALLEL & DISTRIBUTED PROCESSING, VOLS 1-5, 2009, : 2510 - 2517
  • [4] QUALITATIVE BENCHMARKING STUDY OF SOFTWARE FOR SWITCH PERFORMANCE EVALUATION
    Julca Coscol, Angel Bernardo
    Tapia Prado, Christian David
    Hilario Falcon, Francisco Manuel
    Corpus Giraldo, Cheyer Marcelino
    [J]. 3C TECNOLOGIA, 2021, 10 (04): : 35 - 49
  • [5] Towards Holistic Continuous Software Performance Assessment
    Ferme, Vincenzo
    Pautasso, Cesare
    [J]. ICPE'17: COMPANION OF THE 2017 ACM/SPEC INTERNATIONAL CONFERENCE ON PERFORMANCE ENGINEERING, 2017, : 159 - 164
  • [6] Towards a Reference Framework for Tactile Robot Performance and Safety Benchmarking
    Kirschner, Robin Jeanne
    Kurdas, Alexander
    Karacan, Kubra
    Junge, Philipp
    Birjandi, Seyed Ali Baradaran
    Mansfeld, Nico
    Abdolshah, Saeed
    Haddadin, Sami
    [J]. 2021 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2021, : 4290 - 4297
  • [7] Towards virtualized and automated software performance test architecture
    Kim, Gwang-Hun
    Kim, Yeon-Gyun
    Chung, Kyung-Yong
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2015, 74 (20) : 8745 - 8759
  • [9] An automated framework for business enterprise performance evaluation
    Anyaeche, C. O.
    Ighravwe, D. E.
    [J]. AFRICAN JOURNAL OF SCIENCE TECHNOLOGY INNOVATION & DEVELOPMENT, 2018, 10 (07): : 793 - 804
  • [10] Performance Evaluation Framework for Software Quality Engineering
    Lee, Joon-Sang
    Jeong, Oksoon
    Ryu, Jewhi
    [J]. 2009 NINTH INTERNATIONAL CONFERENCE ON QUALITY SOFTWARE (QSIC 2009), 2009, : 438 - 443