On the Performance Evaluation of IaaS Cloud Services With System-Level Benchmarks

Cited by: 3
Authors
Ahuja, Sanjay P. [1 ]
Deval, Niharika [2 ]
Affiliations
[1] Univ North Florida, Sch Comp, Comp & Informat Sci, Jacksonville, FL 32224 USA
[2] Univ North Florida, Sch Comp, Jacksonville, FL USA
Keywords
Benchmarking; Infrastructure-as-a-Service; Performance; Performance Variability;
DOI
10.4018/IJCAC.2018010104
CLC Number
TP31 [Computer Software];
Discipline Code
081202; 0835;
Abstract
Infrastructure-as-a-Service (IaaS) is a cloud service model that allows customers to outsource computing resources such as servers and storage. This article evaluates four IaaS cloud services - Amazon EC2, Microsoft Azure, Google Compute Engine, and Rackspace Cloud - in a vendor-neutral approach with regard to system resource usage, including server, file I/O, and network utilization. System-level benchmarking thus provides an objective comparison of cloud providers from a performance standpoint. Unixbench, Dbench, and Iperf are the system-level benchmarks chosen to test the performance of the server, file I/O, and network, respectively. To capture variation in performance, the tests were performed at different times on weekdays and weekends. For each offering, the benchmarks were run on different instance configurations to give cloud users insight into selecting a provider and then sizing VMs appropriately for their workload requirements. In addition to the performance evaluation, the price-per-performance value of each provider is also examined and compared.
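The price-per-performance comparison the abstract mentions can be sketched as a simple ratio of benchmark score to hourly instance price. The sketch below uses made-up placeholder scores and prices for illustration only; they are not the article's measured results, and the function name is an assumption, not the authors' method.

```python
# Hypothetical illustration of a price-per-performance comparison.
# All scores and hourly prices below are invented placeholders,
# NOT figures from the article.

def price_per_performance(score: float, hourly_price_usd: float) -> float:
    """Benchmark score obtained per dollar of hourly instance cost."""
    return score / hourly_price_usd

# Placeholder Unixbench-style index scores and on-demand hourly prices.
providers = {
    "EC2":       {"score": 1500.0, "price": 0.10},
    "Azure":     {"score": 1400.0, "price": 0.09},
    "GCE":       {"score": 1600.0, "price": 0.11},
    "Rackspace": {"score": 1300.0, "price": 0.12},
}

# Rank providers by value delivered per dollar, best first.
ranked = sorted(
    providers.items(),
    key=lambda kv: price_per_performance(kv[1]["score"], kv[1]["price"]),
    reverse=True,
)
for name, v in ranked:
    ppp = price_per_performance(v["score"], v["price"])
    print(f"{name}: {ppp:.0f} score per $/hour")
```

With these placeholder numbers, a cheaper mid-scoring instance can outrank a pricier top scorer, which is the kind of trade-off the article's price-per-performance analysis is meant to surface.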
Pages: 80-96
Page count: 17