Software support for multiprocessor latency measurement and evaluation

Cited by: 4
Authors
Yan, Y [1 ]
Zhang, XD [1 ]
Ma, Q [1 ]
Affiliation
[1] BEIJING AUTOMATED SYST CO LTD,COMP SYST ADVISORS GRP,BEIJING 100027,PEOPLES R CHINA
Funding
US National Science Foundation
Keywords
latency analysis; parallel computing scalability; performance graphical presentation; software tools; KSR multiprocessors;
DOI
10.1109/32.581326
CLC Classification Number
TP31 [Computer Software]
Discipline Classification Code
081202; 0835
Abstract
Parallel computing scalability evaluates the extent to which parallel programs and architectures can effectively utilize increasing numbers of processors. In this paper, we compare a group of existing scalability metrics and evaluation models with an experimental metric that uses network latency to measure and evaluate the scalability of parallel programs and architectures. To provide insight into dynamic system performance, we have developed an integrated software environment prototype, called Scale-Graph, for measuring and evaluating multiprocessor scalability performance. Scale-Graph uses a graphical instrumentation monitor to collect, measure, and analyze latency-related data, and to display scalability performance based on various program execution patterns. The graphical software tool is X Window-based and currently implemented on standard workstations to analyze performance data of the KSR-1, a hierarchical ring-based shared-memory architecture.
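The latency-based scalability idea described in the abstract can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the ratio-of-average-latencies definition, the function name `latency_scalability`, and the sample values are hypothetical and do not reproduce the paper's actual metric or Scale-Graph's measurements.

```python
# Illustrative (assumed) latency-based scalability comparison:
# take per-processor network-latency samples at two system sizes and
# compare their averages. A ratio near 1.0 would suggest that latency
# overhead grows slowly as processors are added.

def latency_scalability(lat_small, lat_large):
    """Ratio of average latency at the smaller size to the larger size."""
    avg_small = sum(lat_small) / len(lat_small)
    avg_large = sum(lat_large) / len(lat_large)
    return avg_small / avg_large

# Made-up latency samples (arbitrary time units) at 8 and 16 processors.
lat_8  = [1.2, 1.3, 1.1, 1.4]
lat_16 = [1.5, 1.6, 1.4, 1.7]
print(round(latency_scalability(lat_8, lat_16), 3))  # -> 0.806
```

A tool like Scale-Graph would gather such samples with an instrumentation monitor across many execution patterns and present the trend graphically rather than as a single number.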
Pages: 4-16
Page count: 13