QoR-Aware Power Capping for Approximate Big Data Processing

Cited by: 0
Authors
Nabavinejad, Seyed Morteza [1 ,2 ]
Zhan, Xin [1 ]
Azimi, Reza [1 ]
Goudarzi, Maziar [2 ]
Reda, Sherief [1 ]
Affiliations
[1] Brown Univ, Sch Engn, Providence, RI 02912 USA
[2] Sharif Univ Technol, Dept Comp Engn, Tehran, Iran
Funding
US National Science Foundation;
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP [automation technology, computer technology];
Discipline classification code
0812 ;
Abstract
To limit the peak power consumption of a cluster, a centralized power capping system typically assigns power caps to the individual servers, which are then enforced by local capping controllers. As a consequence, the performance and throughput of the servers degrade, and the runtime of jobs is extended. We observe that servers in big data processing clusters often execute big data applications that have different tolerances for approximate results. To mitigate the impact of power capping, we propose a new power-Capping-aware resource manager for Approximate Big data processing (CAB) that takes into consideration the minimum Quality-of-Result (QoR) of the jobs. We use industry-standard feedback power capping controllers to enforce a power cap quickly, while simultaneously modifying the resource allocations to the various jobs based on their progress rates, target minimum QoR, and the power cap, such that the impact of capping on runtime is minimized. Based on the applied cap and the progress rates of jobs, CAB dynamically allocates computing resources (i.e., number of cores and memory) to the jobs to mitigate the impact of capping on their finish times. We implement CAB in Hadoop-2.7.3 and evaluate its improvement over other methods on a state-of-the-art 28-core Xeon server. We demonstrate that CAB reduces the impact of power capping on runtime by up to 39.4% while meeting the minimum QoR constraints.
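The abstract describes allocating cores and memory under a power cap based on each job's progress rate and minimum QoR. The paper does not give the algorithm here, so the following is only a minimal illustrative sketch, assuming a simple proportional heuristic: jobs that tolerate approximate results (low minimum QoR) cede cores to jobs that need near-exact results, weighted by how far each job is from finishing. All names (`allocate_cores`, `min_qor`, `progress`) are hypothetical, not CAB's actual interface.

```python
# Hypothetical sketch, NOT the paper's algorithm: split a fixed core budget
# (capped by the power limit) among jobs in proportion to min_qor * (1 - progress).

def allocate_cores(jobs, total_cores):
    """Allocate total_cores among jobs, favoring high-QoR, low-progress jobs."""
    weights = {name: j["min_qor"] * (1.0 - j["progress"]) for name, j in jobs.items()}
    total = sum(weights.values())
    if total == 0:  # all jobs finished or fully approximate: split evenly
        return {name: total_cores // len(jobs) for name in jobs}
    # Floor each share, then hand leftover cores to the heaviest-weighted jobs.
    alloc = {name: int(total_cores * w / total) for name, w in weights.items()}
    leftover = total_cores - sum(alloc.values())
    for name in sorted(weights, key=weights.get, reverse=True)[:leftover]:
        alloc[name] += 1
    return alloc

jobs = {
    "exact_query":  {"min_qor": 1.0, "progress": 0.2},  # needs a full-accuracy result
    "approx_query": {"min_qor": 0.5, "progress": 0.2},  # tolerates 50% QoR
}
print(allocate_cores(jobs, 28))  # on the 28-core server, the exact job gets ~2x the cores
```

Under this toy heuristic the exact-result job receives roughly twice the cores of the 50%-QoR job; CAB's real controller additionally reacts to the applied power cap and measured progress rates at runtime.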
Pages: 253-256
Page count: 4