Towards Continuous Benchmarking: An Automated Performance Evaluation Framework for High Performance Software

Cited by: 3
Authors
Anzt, Hartwig [1 ,2 ]
Chen, Yen-Chen [3 ]
Cojean, Terry [1 ]
Dongarra, Jack [2 ,4 ,5 ]
Flegar, Goran [6 ]
Nayak, Pratik [1 ]
Quintana-Orti, Enrique S. [7 ]
Tsai, Yuhsiang M. [3 ]
Wang, Weichung [3 ]
Affiliations
[1] Karlsruhe Inst Technol, Karlsruhe, Germany
[2] Univ Tennessee, Knoxville, TN 37996 USA
[3] Taiwan Natl Univ, Taipei, Taiwan
[4] Oak Ridge Natl Lab, Oak Ridge, TN USA
[5] Univ Manchester, Manchester, Lancs, England
[6] Univ Jaime I, Castellon de La Plana, Castello, Spain
[7] Univ Politecn Valencia, Valencia, Spain
Keywords
interactive performance visualization; automated performance benchmarking; continuous integration; healthy software lifecycle;
DOI
10.1145/3324989.3325719
Chinese Library Classification
TP39 (applications of computers)
Subject Classification Codes
081203; 0835
Abstract
We present an automated performance evaluation framework that enables an automated workflow for testing and performance evaluation of software libraries. Integrating this component into an ecosystem enables sustainable software development, as a community effort, via a web application for interactively evaluating the performance of individual software components. The performance evaluation tool is based exclusively on web technologies, which removes the burden of downloading performance data or installing additional software. We employ this framework for the GINKGO software ecosystem, but the framework can be used with essentially any software project, including comparisons between different software libraries. The Continuous Integration (CI) framework of GINKGO is also extended to automatically run a benchmark suite on predetermined HPC systems, store the state of the machine and the environment along with the compiled binaries, and collect the results in a publicly accessible, Git-based performance data repository. The GINKGO performance explorer (GPE) retrieves the performance data from the repository and visualizes it in a web browser. GPE also implements an interface that allows users to write scripts, archived in a Git repository, to extract particular data, compute particular metrics, and visualize them in many different formats (as specified by the script). The combination of these approaches creates a workflow that enables performance reproducibility and software sustainability of scientific software. In this paper, we present example scripts that extract and visualize performance data for GINKGO's SpMV kernels, allowing users to identify the optimal kernel for specific problem characteristics.
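The abstract describes user scripts that query a Git-hosted performance data repository to pick the best SpMV kernel for a given problem. The sketch below illustrates that idea only: the JSON schema, kernel names, and helper functions are assumptions for illustration, not the actual Ginkgo/GPE data format or API.

```python
import json

# Hypothetical performance record, mimicking one entry of a Git-hosted
# benchmark results file (schema assumed for illustration).
record = json.loads("""
{
  "problem": {"name": "thermal2", "nonzeros": 8580313},
  "spmv": {
    "csr":    {"time": 4.1e-3, "completed": true},
    "coo":    {"time": 5.9e-3, "completed": true},
    "ell":    {"time": 3.2e-3, "completed": true},
    "hybrid": {"time": 3.6e-3, "completed": true},
    "sellp":  {"time": 7.0e-3, "completed": false}
  }
}
""")

def best_spmv_kernel(rec):
    """Return (kernel_name, runtime_s) of the fastest completed SpMV run."""
    runs = {k: v["time"] for k, v in rec["spmv"].items() if v["completed"]}
    name = min(runs, key=runs.get)
    return name, runs[name]

def gflops(rec, seconds):
    """Approximate SpMV throughput, counting 2 flops per stored nonzero."""
    return 2.0 * rec["problem"]["nonzeros"] / seconds / 1e9

name, t = best_spmv_kernel(record)
print(f"{record['problem']['name']}: fastest kernel = {name}, "
      f"{gflops(record, t):.2f} GFLOP/s")
```

A GPE-style script would apply the same selection across many matrices to map problem characteristics (size, nonzeros per row) to the optimal kernel.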
Pages: 11