On the Performance Evaluation of IaaS Cloud Services With System-Level Benchmarks

Cited by: 3
Authors
Ahuja, Sanjay P. [1 ]
Deval, Niharika [2 ]
Affiliations
[1] Univ North Florida, Sch Comp, Comp & Informat Sci, Jacksonville, FL 32224 USA
[2] Univ North Florida, Sch Comp, Jacksonville, FL USA
Keywords
Benchmarking; Infrastructure-as-a-Service; Performance; Performance Variability
DOI
10.4018/IJCAC.2018010104
CLC number
TP31 [Computer Software]
Subject classification codes
081202; 0835
Abstract
Infrastructure-as-a-Service (IaaS) is a cloud service model that allows customers to outsource computing resources such as servers and storage. This article evaluates four IaaS cloud services (Amazon EC2, Microsoft Azure, Google Compute Engine, and Rackspace Cloud) in a vendor-neutral manner with regard to system resource usage, including server, file I/O, and network utilization. System-level benchmarking thus provides an objective comparison of cloud providers from a performance standpoint. UnixBench, Dbench, and iPerf are the system-level benchmarks chosen to test the performance of the server, file I/O, and the network, respectively. To capture variation in performance, the tests were performed at different times on weekdays and weekends. For each offering, the benchmarks were run on several configurations to give cloud users insight into selecting a provider and then sizing VMs appropriately for their workload requirements. In addition to the performance evaluation, the price-per-performance value of all the providers is also examined and compared.
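The price-per-performance comparison described in the abstract can be sketched as a benchmark score divided by hourly cost. The scores, prices, and provider labels below are hypothetical placeholders for illustration, not data from the article:

```python
# Minimal price-per-performance sketch (higher is better).
# All scores and hourly prices are hypothetical, not the paper's measurements.

def price_per_performance(score: float, hourly_price_usd: float) -> float:
    """Return benchmark score per dollar per hour."""
    return score / hourly_price_usd

# Hypothetical UnixBench index scores and on-demand hourly prices for small VMs.
providers = {
    "ProviderA": {"score": 1500.0, "price": 0.10},
    "ProviderB": {"score": 1400.0, "price": 0.09},
}

# Rank providers by value: a cheaper VM with a slightly lower score can win.
ranked = sorted(
    providers.items(),
    key=lambda kv: price_per_performance(kv[1]["score"], kv[1]["price"]),
    reverse=True,
)
for name, v in ranked:
    print(name, round(price_per_performance(v["score"], v["price"]), 1))
```

This is only a sketch of the comparison metric; the article applies the same idea per benchmark (server, file I/O, network) and per VM configuration.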
Pages: 80-96
Page count: 17