EAIS: Energy-aware adaptive scheduling for CNN inference on high-performance GPUs

Cited by: 9
Authors
Yao, Chunrong [1 ]
Liu, Wantao [2 ]
Tang, Weiqing [1 ,3 ]
Hu, Songlin [2 ]
Affiliations
[1] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Nanjing 210094, Peoples R China
[2] Chinese Acad Sci, Inst Informat Engn, Beijing 100093, Peoples R China
[3] Chinese Acad Sci, Inst Comp Technol, Beijing 100190, Peoples R China
Keywords
Energy-aware; Convolutional neural network (CNN) inference; High-performance GPUs; Workload scheduling; Service-Level-Objective (SLO); DVFS; ALGORITHM;
DOI
10.1016/j.future.2022.01.004
Chinese Library Classification
TP301 [Theory, Methods];
Subject Classification Code
081202;
Abstract
Recently, a large number of convolutional neural network (CNN) inference services have emerged on high-performance Graphics Processing Units (GPUs). However, GPUs consume substantial power, and energy consumption rises sharply as deep learning tasks are deployed. Although previous studies have considered the latency Service-Level-Objective (SLO) of inference services, they do not directly take energy consumption into account. Our investigation shows that coordinating batching and dynamic voltage and frequency scaling (DVFS) settings can decrease the energy consumption of CNN inference, but doing so is complicated by (i) the large configuration space; (ii) GPU underutilization while data are transferred between CPUs and GPUs; and (iii) fluctuating workloads. In this paper, we propose EAIS, an energy-aware adaptive scheduling framework composed of a performance model, an asynchronous execution strategy, and an energy-aware scheduler. The performance model captures the performance characteristics of CNN inference services to shrink the feasible configuration space. The asynchronous execution strategy overlaps data upload with GPU execution to improve system processing capacity. The energy-aware scheduler adaptively coordinates batching and DVFS according to the fluctuating workload to minimize energy consumption while meeting the latency SLO. Our experimental results on NVIDIA Tesla M40 and V100 GPUs show that, compared to state-of-the-art methods, EAIS decreases energy consumption by up to 28.02% and improves system processing capacity by up to 7.22% while meeting the latency SLO. In addition, EAIS is shown to be versatile under different latency SLO constraints. (C) 2022 Elsevier B.V. All rights reserved.
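The core mechanism described in the abstract is a scheduler that coordinates batch size and DVFS frequency: it searches the feasible configurations and picks the one with the lowest predicted energy that still meets the latency SLO. The Python sketch below illustrates only that selection idea; all names, DVFS levels, and the linear latency and power models are hypothetical placeholders, not the paper's measured performance model or actual implementation.

```python
# Minimal sketch of batching + DVFS coordination under a latency SLO.
# The coefficients, frequency levels, and models are illustrative assumptions;
# EAIS derives its performance model from profiling and applies DVFS via the GPU driver.

from dataclasses import dataclass


@dataclass(frozen=True)
class Config:
    batch_size: int   # requests processed per GPU launch
    freq_mhz: int     # assumed GPU core frequency (DVFS setting)


def predict_latency_ms(cfg: Config) -> float:
    """Assumed linear latency model: fixed overhead plus a per-request cost
    that shrinks as the clock frequency rises (hypothetical coefficients)."""
    per_req_ms = 2.0 * (1530.0 / cfg.freq_mhz)
    return 5.0 + per_req_ms * cfg.batch_size


def predict_power_w(cfg: Config) -> float:
    """Assumed power model: idle power plus a term that grows with frequency."""
    return 60.0 + 0.12 * cfg.freq_mhz


def energy_per_request_mj(cfg: Config) -> float:
    """Energy per request = power (W) x batch latency (ms) / batch size."""
    return predict_power_w(cfg) * predict_latency_ms(cfg) / cfg.batch_size


def choose_config(slo_ms: float, queue_len: int) -> Config | None:
    """Pick the feasible (batch size, frequency) pair with the lowest predicted
    energy per request whose batch latency still fits within the latency SLO."""
    batch_sizes = [b for b in (1, 2, 4, 8, 16, 32) if b <= max(queue_len, 1)]
    freqs_mhz = (700, 900, 1100, 1300, 1530)  # hypothetical DVFS levels
    feasible = [
        Config(b, f)
        for b in batch_sizes
        for f in freqs_mhz
        if predict_latency_ms(Config(b, f)) <= slo_ms
    ]
    if not feasible:
        return None  # no setting meets the SLO; a real scheduler would fall back
    return min(feasible, key=energy_per_request_mj)


if __name__ == "__main__":
    # With a 50 ms SLO and 20 queued requests, report the cheapest feasible pair.
    print(choose_config(slo_ms=50.0, queue_len=20))
```

In this toy model the trade-off is visible: lower frequencies cut power but stretch batch latency, so the cheapest configuration that still satisfies the SLO is not necessarily the slowest one. The paper's scheduler additionally adapts this choice as the workload fluctuates and overlaps data upload with GPU execution.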
Pages: 253-268
Page count: 16
Related papers
50 items in total
  • [31] SparseFT: Sparsity-aware Fault Tolerance for Reliable CNN Inference on GPUs
    Byeon, Gwangeun
    Lee, Seungtae
    Kim, Seongwook
    Kim, Yongjun
    Nair, Prashant J.
    Hong, Seokin
    [J]. 2023 32ND INTERNATIONAL CONFERENCE ON PARALLEL ARCHITECTURES AND COMPILATION TECHNIQUES, PACT, 2023, : 337 - 338
  • [32] Florets for Chiplets: Data Flow-aware High-Performance and Energy-efficient Network-on-Interposer for CNN Inference Tasks
    Sharma, Harsh
    Pfromm, Lukas
    Topaloglu, Rasit Onur
    Doppa, Janardhan Rao
    Ogras, Umit Y.
    Kalyanraman, Ananth
    Pande, Partha Pratim
    [J]. ACM TRANSACTIONS ON EMBEDDED COMPUTING SYSTEMS, 2023, 22 (05)
  • [33] Energy-aware scheduling in cloud computing systems
    Tomas Cotes-Ruiz, Ivan
    Prado, Rocio P.
    Garcia-Galan, Sebastian
    Enrique Munoz-Exposito, Jose
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS (FUZZ-IEEE), 2017,
  • [34] Energy-Aware Scheduling for Sensor Node Platforms
    Tak, Sungwoo
    Kim, Hangeul
    Kim, Donglyul
    Kim, Yougyung
    [J]. 2014 15TH INTERNATIONAL CONFERENCE ON PARALLEL AND DISTRIBUTED COMPUTING, APPLICATIONS AND TECHNOLOGIES (PDCAT 2014), 2014, : 61 - 68
  • [35] Energy-Aware Scheduling of Tasks in Cloud Computing
    Mehor, Yamina
    Rebbah, Mohammed
    Smail, Omar
    [J]. Informatica (Slovenia), 2024, 48 (16): 125 - 136
  • [36] Energy-aware parallel task scheduling in a cluster
    Wang, Lizhe
    Khan, Samee U.
    Chen, Dan
    Kolodziej, Joanna
    Ranjan, Rajiv
    Xu, Cheng-zhong
    Zomaya, Albert
    [J]. FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2013, 29 (07): 1661 - 1670
  • [37] Towards Energy-aware Scheduling of Scientific Workflows
    Warade, Mehul
    Schneider, Jean-Guy
    Lee, Kevin
    [J]. 2022 INTERNATIONAL CONFERENCE ON GREEN ENERGY, COMPUTING AND SUSTAINABLE TECHNOLOGY (GECOST), 2022, : 93 - 98
  • [38] Energy-aware Scheduling Algorithms for Network Stability
    Andrews, Matthew
    Antonakopoulos, Spyridon
    Zhang, Lisa
    [J]. 2011 PROCEEDINGS IEEE INFOCOM, 2011, : 1359 - 1367
  • [39] Energy-aware production scheduling for additive manufacturing
    Karimi, Sajad
    Kwon, Soongeol
    Ning, Fuda
    [J]. JOURNAL OF CLEANER PRODUCTION, 2021, 278
  • [40] Energy-aware Scheduling in Transactional Memory Systems
    Marques Junior, Ademir
    Baldassin, Alexandro
    [J]. 2016 29TH SYMPOSIUM ON INTEGRATED CIRCUITS AND SYSTEMS DESIGN (SBCCI), 2016,